Feb 15 17:05:42 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 15 17:05:42 crc restorecon[4580]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:42 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 15 17:05:43 crc restorecon[4580]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 
17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 
crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 
17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 
crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc 
restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 15 17:05:43 crc restorecon[4580]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 15 17:05:44 crc kubenswrapper[4585]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 15 17:05:44 crc kubenswrapper[4585]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 15 17:05:44 crc kubenswrapper[4585]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 15 17:05:44 crc kubenswrapper[4585]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 15 17:05:44 crc kubenswrapper[4585]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 15 17:05:44 crc kubenswrapper[4585]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.536999 4585 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543013 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543053 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543066 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543077 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543088 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543098 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543107 4585 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543115 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543122 4585 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 15 17:05:44 crc kubenswrapper[4585]: 
W0215 17:05:44.543131 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543141 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543154 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543164 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543176 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543186 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543199 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543211 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543222 4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543232 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543241 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543250 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543260 4585 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543272 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543281 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543291 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543300 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543310 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543338 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543349 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543359 4585 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543368 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543378 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543387 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543399 4585 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543409 4585 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543419 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543429 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543439 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543449 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543459 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543492 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543508 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543521 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543534 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543547 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543555 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543563 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543573 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543583 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543593 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543634 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543642 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543652 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543662 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543671 4585 feature_gate.go:330] unrecognized feature gate: Example
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543679 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543686 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543694 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543702 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543709 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543717 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543725 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543732 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543740 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543748 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543756 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543767 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543774 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543782 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543790 4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.543798 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.543961 4585 flags.go:64] FLAG: --address="0.0.0.0"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.543978 4585 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.543993 4585 flags.go:64] FLAG: --anonymous-auth="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544005 4585 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544017 4585 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544026 4585 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544038 4585 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544053 4585 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544062 4585 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544072 4585 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544082 4585 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544093 4585 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544102 4585 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544111 4585 flags.go:64] FLAG: --cgroup-root=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544121 4585 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544131 4585 flags.go:64] FLAG: --client-ca-file=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544140 4585 flags.go:64] FLAG: --cloud-config=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544149 4585 flags.go:64] FLAG: --cloud-provider=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544158 4585 flags.go:64] FLAG: --cluster-dns="[]"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544170 4585 flags.go:64] FLAG: --cluster-domain=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544180 4585 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544189 4585 flags.go:64] FLAG: --config-dir=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544198 4585 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544208 4585 flags.go:64] FLAG: --container-log-max-files="5"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544220 4585 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544229 4585 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544238 4585 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544247 4585 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544256 4585 flags.go:64] FLAG: --contention-profiling="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544265 4585 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544275 4585 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544285 4585 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544293 4585 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544304 4585 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544313 4585 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544322 4585 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544331 4585 flags.go:64] FLAG: --enable-load-reader="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544340 4585 flags.go:64] FLAG: --enable-server="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544349 4585 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544360 4585 flags.go:64] FLAG: --event-burst="100"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544369 4585 flags.go:64] FLAG: --event-qps="50"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544379 4585 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544388 4585 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544398 4585 flags.go:64] FLAG: --eviction-hard=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544409 4585 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544418 4585 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544430 4585 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544440 4585 flags.go:64] FLAG: --eviction-soft=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544449 4585 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544457 4585 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544466 4585 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544475 4585 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544484 4585 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544493 4585 flags.go:64] FLAG: --fail-swap-on="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544502 4585 flags.go:64] FLAG: --feature-gates=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544513 4585 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544522 4585 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544532 4585 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544541 4585 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544550 4585 flags.go:64] FLAG: --healthz-port="10248"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544559 4585 flags.go:64] FLAG: --help="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544568 4585 flags.go:64] FLAG: --hostname-override=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544577 4585 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544586 4585 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544594 4585 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544631 4585 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544640 4585 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544649 4585 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544657 4585 flags.go:64] FLAG: --image-service-endpoint=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544666 4585 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544675 4585 flags.go:64] FLAG: --kube-api-burst="100"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544684 4585 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544694 4585 flags.go:64] FLAG: --kube-api-qps="50"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544702 4585 flags.go:64] FLAG: --kube-reserved=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544712 4585 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544720 4585 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544729 4585 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544739 4585 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544748 4585 flags.go:64] FLAG: --lock-file=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544758 4585 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544768 4585 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544776 4585 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544790 4585 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544800 4585 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544809 4585 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544818 4585 flags.go:64] FLAG: --logging-format="text"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544826 4585 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544836 4585 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544845 4585 flags.go:64] FLAG: --manifest-url=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544854 4585 flags.go:64] FLAG: --manifest-url-header=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544866 4585 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544875 4585 flags.go:64] FLAG: --max-open-files="1000000"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544890 4585 flags.go:64] FLAG: --max-pods="110"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544903 4585 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544915 4585 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544926 4585 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544937 4585 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544949 4585 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544961 4585 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.544973 4585 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545089 4585 flags.go:64] FLAG: --node-status-max-images="50"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545103 4585 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545116 4585 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545127 4585 flags.go:64] FLAG: --pod-cidr=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545137 4585 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545154 4585 flags.go:64] FLAG: --pod-manifest-path=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545166 4585 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545178 4585 flags.go:64] FLAG: --pods-per-core="0"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545190 4585 flags.go:64] FLAG: --port="10250"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545202 4585 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545214 4585 flags.go:64] FLAG: --provider-id=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545227 4585 flags.go:64] FLAG: --qos-reserved=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545238 4585 flags.go:64] FLAG: --read-only-port="10255"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545250 4585 flags.go:64] FLAG: --register-node="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545259 4585 flags.go:64] FLAG: --register-schedulable="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545271 4585 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545292 4585 flags.go:64] FLAG: --registry-burst="10"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545304 4585 flags.go:64] FLAG: --registry-qps="5"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545315 4585 flags.go:64] FLAG: --reserved-cpus=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545327 4585 flags.go:64] FLAG: --reserved-memory=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545342 4585 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545354 4585 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545366 4585 flags.go:64] FLAG: --rotate-certificates="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545378 4585 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545390 4585 flags.go:64] FLAG: --runonce="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545402 4585 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545413 4585 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545426 4585 flags.go:64] FLAG: --seccomp-default="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545437 4585 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545449 4585 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545461 4585 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545472 4585 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545484 4585 flags.go:64] FLAG: --storage-driver-password="root"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545496 4585 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545508 4585 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545519 4585 flags.go:64] FLAG: --storage-driver-user="root"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545530 4585 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545543 4585 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545555 4585 flags.go:64] FLAG: --system-cgroups=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545566 4585 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545587 4585 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545630 4585 flags.go:64] FLAG: --tls-cert-file=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545642 4585 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545660 4585 flags.go:64] FLAG: --tls-min-version=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545671 4585 flags.go:64] FLAG: --tls-private-key-file=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545683 4585 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545696 4585 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545708 4585 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545720 4585 flags.go:64] FLAG: --v="2"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545735 4585 flags.go:64] FLAG: --version="false"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545749 4585 flags.go:64] FLAG: --vmodule=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545764 4585 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.545777 4585 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546015 4585 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546028 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546041 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546051 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546060 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546070 4585 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546084 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546096 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546105 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546113 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546123 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546131 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546140 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546150 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546161 4585 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546170 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546180 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546194 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546207 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546218 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546228 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546238 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546248 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546258 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546268 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546283 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546292 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546300 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546307 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546315 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546323 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546330 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546338 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546345 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546355 4585 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546363 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546370 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546378 4585 feature_gate.go:330] unrecognized feature gate: Example
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546385 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546393 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546401 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546409 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546417 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546425 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546433 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546440 4585 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546449 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546456 4585 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546467 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546477 4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546485 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546493 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546501 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546509 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546517 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546525 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546533 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546543 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546551 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546558 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546567 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546574 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546582 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546589 4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546624 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546633 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546640 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546649 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546657 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546665 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.546674 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.546699 4585 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.562361 4585 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.562407 4585 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562760 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562782 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562793 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562805 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562817 4585 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562828 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562840 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562850 4585 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562861 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562872 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562883 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562893 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562904 4585 feature_gate.go:330] 
unrecognized feature gate: InsightsRuntimeExtractor Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562917 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562931 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562942 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562953 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562963 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562974 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.562987 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563003 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563014 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563024 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563033 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563043 4585 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563053 4585 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563063 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563074 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563084 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563095 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563106 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563116 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563126 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563138 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 
17:05:44.563153 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563163 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563173 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563183 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563193 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563203 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563213 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563223 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563233 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563243 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563255 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563268 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563279 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563288 4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563298 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563309 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563318 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563329 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563342 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563355 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563367 4585 feature_gate.go:330] unrecognized feature gate: Example Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563377 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563389 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563404 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563414 4585 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563426 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563436 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563445 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563455 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563498 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563508 4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563518 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563528 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563538 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI 
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563549 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563558 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563571 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.563587 4585 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563915 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563936 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563947 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563957 4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563968 4585 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563979 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563988 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.563998 4585 feature_gate.go:330] unrecognized 
feature gate: OnClusterBuild Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564008 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564021 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564031 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564042 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564052 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564062 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564073 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564083 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564094 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564105 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564114 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564124 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564135 4585 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564144 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 15 17:05:44 crc 
kubenswrapper[4585]: W0215 17:05:44.564154 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564164 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564173 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564184 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564194 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564204 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564213 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564223 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564232 4585 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564243 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564252 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564264 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564275 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564284 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564295 
4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564304 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564314 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564323 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564333 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564343 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564354 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564365 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564374 4585 feature_gate.go:330] unrecognized feature gate: Example Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564384 4585 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564395 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564405 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564419 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564434 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564445 4585 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564455 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564467 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564478 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564488 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564499 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564513 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564526 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564537 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564548 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564558 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564568 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564578 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564591 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564640 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564652 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564663 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564674 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564684 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564697 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.564712 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.564726 4585 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.565064 4585 server.go:940] "Client rotation is on, will bootstrap in background" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.572452 4585 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.572629 4585 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.574864 4585 server.go:997] "Starting client certificate rotation" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.574914 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.575172 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-15 19:26:23.802273195 +0000 UTC Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.575344 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.605330 4585 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.607160 4585 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.610256 4585 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.629171 4585 log.go:25] "Validated CRI v1 runtime API" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.673843 4585 log.go:25] "Validated CRI v1 image API" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.677283 4585 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.682527 4585 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-15-16-59-40-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.682583 4585 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.704140 4585 manager.go:217] Machine: {Timestamp:2026-02-15 17:05:44.701417763 +0000 UTC m=+0.644825975 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8fecb70f-9a43-454c-bc0f-3400703ceb5f BootID:ae821cdc-4077-4a14-ad50-91dcc5071f65 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 
Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:6d:a8:8d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:6d:a8:8d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d0:8c:24 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6f:20:8a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:1a:7c:25 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ba:87:f7 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1a:3d:60:da:2a:e9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:1c:4a:fa:2a:29 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 
Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.704554 4585 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.704872 4585 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.708269 4585 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.708591 4585 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.708712 4585 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.709100 4585 topology_manager.go:138] "Creating topology manager with none policy"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.709122 4585 container_manager_linux.go:303] "Creating device plugin manager"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.709684 4585 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.709736 4585 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.710065 4585 state_mem.go:36] "Initialized new in-memory state store"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.710192 4585 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.714201 4585 kubelet.go:418] "Attempting to sync node with API server"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.714270 4585 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.714342 4585 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.714371 4585 kubelet.go:324] "Adding apiserver pod source"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.714390 4585 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.721128 4585 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.722569 4585 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.724386 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.724552 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError"
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.724369 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.724660 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.725618 4585 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727506 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727551 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727567 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727584 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727636 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727650 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727663 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727684 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727700 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727715 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727733 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.727746 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.730204 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.731025 4585 server.go:1280] "Started kubelet"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.731201 4585 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.731887 4585 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.731971 4585 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 15 17:05:44 crc systemd[1]: Started Kubernetes Kubelet.
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.733479 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.734501 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.734572 4585 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.734975 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:29:18.303988469 +0000 UTC
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.744456 4585 server.go:460] "Adding debug handlers to kubelet server"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.748338 4585 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.748377 4585 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.748747 4585 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.752651 4585 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.752688 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused
Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.752954 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError"
Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.752177 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.190:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18947a68f8b0e238 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-15 17:05:44.730976824 +0000 UTC m=+0.674384996,LastTimestamp:2026-02-15 17:05:44.730976824 +0000 UTC m=+0.674384996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.757187 4585 factory.go:55] Registering systemd factory
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.757238 4585 factory.go:221] Registration of the systemd container factory successfully
Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.757731 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="200ms"
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.760542 4585 factory.go:153] Registering CRI-O factory
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.760610 4585 factory.go:221] Registration of the crio container factory successfully
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.760718 4585 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.760750 4585 factory.go:103] Registering Raw factory
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.760776 4585 manager.go:1196] Started watching for new ooms in manager
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.763071 4585 manager.go:319] Starting recovery of all containers
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768137 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768258 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768286 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768306 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768326 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768348 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768369 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768393 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768420 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768441 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768462 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768487 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768508 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768536 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768558 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768580 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768635 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768659 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768681 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768701 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768764 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768800 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768831 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768860 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768921 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.768952 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769017 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769044 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769067 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769098 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769119 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769140 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769163 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769213 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769233 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769284 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769305 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769336 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769363 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769392 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769419 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769451 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769480 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769508 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769537 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769566 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769666 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769724 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769770 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769823 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769852 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769882 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769925 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769960 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.769990 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770033 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770064 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770093 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770123 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770153 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770181 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770210 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770307 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770331 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770392 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770424 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770446 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770468 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770502 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770532 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770560 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770585 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770655 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770679 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770732 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770755 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770783 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770909 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770938 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770960 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.770985 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771007 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771030 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771059 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771088 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771136 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771170 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771200 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771232 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136"
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771276 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771336 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771387 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771417 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771446 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771479 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771507 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771535 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771563 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771681 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771733 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771766 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" 
seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771803 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771824 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771848 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771900 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771931 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.771961 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 
17:05:44.771994 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772017 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772052 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772083 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772120 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772141 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772170 4585 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772192 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772221 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772244 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772265 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772286 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772309 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772331 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772382 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772423 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772450 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772470 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772497 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772523 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772547 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772655 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772703 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772738 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772758 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772777 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772798 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772818 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772841 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772863 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772889 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772915 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772940 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.772978 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773006 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773033 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773062 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773090 4585 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773117 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773142 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773167 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773196 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773225 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773247 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773271 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773294 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773317 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773351 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773370 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773390 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773412 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773432 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773451 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773471 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773494 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773514 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773535 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773556 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773575 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773594 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773673 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773695 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773719 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773745 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773773 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773799 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773826 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773853 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" 
seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773879 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773911 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773940 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773967 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.773992 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774021 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774047 4585 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774073 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774104 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774131 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774156 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774180 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774200 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774220 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774261 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774281 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774299 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774318 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774338 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774359 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.774379 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.782193 4585 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.782273 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.782333 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783385 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783420 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783441 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783561 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783584 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783668 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783697 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783723 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783751 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783778 4585 reconstruct.go:97] "Volume reconstruction finished" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.783795 4585 reconciler.go:26] "Reconciler: start to sync state" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.795136 4585 manager.go:324] Recovery completed Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.808343 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.813710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.813759 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.813771 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.815065 4585 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.815090 4585 
cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.815120 4585 state_mem.go:36] "Initialized new in-memory state store" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.835902 4585 policy_none.go:49] "None policy: Start" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.836871 4585 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.836899 4585 state_mem.go:35] "Initializing new in-memory state store" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.837875 4585 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.840293 4585 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.840337 4585 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.840370 4585 kubelet.go:2335] "Starting kubelet main sync loop" Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.840516 4585 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 15 17:05:44 crc kubenswrapper[4585]: W0215 17:05:44.844564 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.844677 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: 
connect: connection refused" logger="UnhandledError" Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.852801 4585 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.896462 4585 manager.go:334] "Starting Device Plugin manager" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.896528 4585 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.896544 4585 server.go:79] "Starting device plugin registration server" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.897082 4585 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.897103 4585 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.897353 4585 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.897448 4585 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.897478 4585 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.903915 4585 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.941780 4585 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 15 17:05:44 crc 
kubenswrapper[4585]: I0215 17:05:44.941887 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.943543 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.943619 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.943641 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.943938 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.945162 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.945223 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.947716 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.947739 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.947785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.947745 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.947812 4585 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.947843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.948040 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.948153 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.948180 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.948886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.948922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.948935 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.949319 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.949362 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.949383 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.949667 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.949816 4585 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.949854 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.950403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.950423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.950431 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.950549 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.950903 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.950931 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.951407 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.951429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.951441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953090 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953098 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953279 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953301 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953680 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.953714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.954488 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.954508 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.954515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.958389 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="400ms" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991302 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991366 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991388 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991421 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991447 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991473 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991531 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991582 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991697 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991761 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.991796 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.997725 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:44 crc 
kubenswrapper[4585]: I0215 17:05:44.998808 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.998848 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.998859 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:44 crc kubenswrapper[4585]: I0215 17:05:44.998891 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 15 17:05:44 crc kubenswrapper[4585]: E0215 17:05:44.999344 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093011 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093078 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093110 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093140 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093237 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093287 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093344 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093378 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093400 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093409 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093419 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093439 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093440 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093465 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093484 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093500 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093520 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093524 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093544 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093547 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093566 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093590 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093618 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093647 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093679 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.093782 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195114 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195163 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195315 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195369 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195428 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195472 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195470 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.195570 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.200343 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.202384 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.202443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.202463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.202505 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 15 17:05:45 crc kubenswrapper[4585]: E0215 17:05:45.203244 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.273138 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.291008 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.317125 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.323523 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.328032 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:45 crc kubenswrapper[4585]: E0215 17:05:45.359295 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="800ms" Feb 15 17:05:45 crc kubenswrapper[4585]: W0215 17:05:45.369025 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-12f38c425752a9ce1aee18a3ff587e56842c47677809c9fd696d868acbda442f WatchSource:0}: Error finding container 12f38c425752a9ce1aee18a3ff587e56842c47677809c9fd696d868acbda442f: Status 404 returned error can't find the container with id 12f38c425752a9ce1aee18a3ff587e56842c47677809c9fd696d868acbda442f Feb 15 17:05:45 crc kubenswrapper[4585]: W0215 17:05:45.371103 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f160952165a19f28ffdaf649be73bc6dfc17f8f0410692c49f4dcb1d910fcf73 WatchSource:0}: Error finding container f160952165a19f28ffdaf649be73bc6dfc17f8f0410692c49f4dcb1d910fcf73: Status 404 returned error can't find the container with id f160952165a19f28ffdaf649be73bc6dfc17f8f0410692c49f4dcb1d910fcf73 Feb 15 17:05:45 crc kubenswrapper[4585]: W0215 17:05:45.373026 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cbc570758117195da47f9ee22d84edbd13156be59ea5380e694825c183f64e5d WatchSource:0}: Error finding container cbc570758117195da47f9ee22d84edbd13156be59ea5380e694825c183f64e5d: Status 404 returned error can't find the container with id 
cbc570758117195da47f9ee22d84edbd13156be59ea5380e694825c183f64e5d Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.603891 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.605960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.606057 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.606085 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.606143 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 15 17:05:45 crc kubenswrapper[4585]: E0215 17:05:45.608556 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Feb 15 17:05:45 crc kubenswrapper[4585]: W0215 17:05:45.690397 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:45 crc kubenswrapper[4585]: E0215 17:05:45.690573 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.735238 4585 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 10:31:52.477699876 +0000 UTC Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.735322 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.857228 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"28a1f2807cdc9ba9709dc8fd1a81221cafc5e8d608e11dfd8a804f6adba15ebe"} Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.858918 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f160952165a19f28ffdaf649be73bc6dfc17f8f0410692c49f4dcb1d910fcf73"} Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.860210 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"12f38c425752a9ce1aee18a3ff587e56842c47677809c9fd696d868acbda442f"} Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.861851 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cbc570758117195da47f9ee22d84edbd13156be59ea5380e694825c183f64e5d"} Feb 15 17:05:45 crc kubenswrapper[4585]: I0215 17:05:45.864717 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"427ccfb6cdc26fc66b2f4f248b84a5447257ba2a3d777a7d85d0c289271ff3e1"} Feb 15 17:05:46 crc kubenswrapper[4585]: W0215 17:05:46.002621 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:46 crc kubenswrapper[4585]: E0215 17:05:46.003759 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:46 crc kubenswrapper[4585]: W0215 17:05:46.025037 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:46 crc kubenswrapper[4585]: E0215 17:05:46.025504 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:46 crc kubenswrapper[4585]: E0215 17:05:46.159964 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="1.6s" Feb 15 17:05:46 crc kubenswrapper[4585]: W0215 17:05:46.321812 
4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:46 crc kubenswrapper[4585]: E0215 17:05:46.321918 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.408786 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.411332 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.411405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.411426 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.411531 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 15 17:05:46 crc kubenswrapper[4585]: E0215 17:05:46.412123 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.690002 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 15 17:05:46 crc kubenswrapper[4585]: E0215 17:05:46.691726 4585 certificate_manager.go:562] 
"Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.736542 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 04:48:28.89918611 +0000 UTC Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.737761 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.868848 4585 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0" exitCode=0 Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.868910 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0"} Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.869009 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.870161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.870183 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 
17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.870193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.872893 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84"} Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.872934 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e"} Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.872946 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4"} Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.874675 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789" exitCode=0 Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.874725 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789"} Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.874819 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.875438 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.875460 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.875469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.876843 4585 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16" exitCode=0 Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.876897 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16"} Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.876985 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.877454 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.877922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.877948 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.877958 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.878517 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 
17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.878553 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.878563 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.882757 4585 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="84c4d470a126536489835fd48c7b32e1f0e14f2df635f678e4e3d3175666c265" exitCode=0 Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.882891 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.883016 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"84c4d470a126536489835fd48c7b32e1f0e14f2df635f678e4e3d3175666c265"} Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.884168 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.884200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:46 crc kubenswrapper[4585]: I0215 17:05:46.884214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.734979 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.737032 4585 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 07:05:36.81049338 +0000 UTC Feb 15 17:05:47 crc kubenswrapper[4585]: E0215 17:05:47.761534 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="3.2s" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.893223 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.893269 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.893279 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.893375 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.895886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.895924 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:47 crc 
kubenswrapper[4585]: I0215 17:05:47.895935 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.899106 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.899230 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.900045 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.900086 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.900101 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.902444 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.902477 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.902491 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.902503 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.904893 4585 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2" exitCode=0 Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.904948 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.905068 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.905950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.905979 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.905990 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.908681 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4880fbe29ae3839309a0c788dbf02d73869757aa52899c34e89cc857e8f4c17d"} Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.908767 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.909454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.909474 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:47 crc kubenswrapper[4585]: I0215 17:05:47.909484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:47 crc kubenswrapper[4585]: W0215 17:05:47.950271 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:47 crc kubenswrapper[4585]: E0215 17:05:47.950349 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.013159 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.014197 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.014245 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.014260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.014287 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 15 17:05:48 crc kubenswrapper[4585]: E0215 17:05:48.014842 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.190:6443: connect: connection refused" node="crc" Feb 15 17:05:48 crc kubenswrapper[4585]: W0215 17:05:48.149885 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:05:48 crc kubenswrapper[4585]: E0215 17:05:48.150000 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.190:6443: connect: connection refused" logger="UnhandledError" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.737470 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:44:23.405043314 +0000 UTC Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.915984 4585 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c" exitCode=0 Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.916064 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c"} Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.916176 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.917937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.918032 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.918065 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.923800 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.923931 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.923939 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.923802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645"} Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.924539 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.924634 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925699 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925590 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925926 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.925969 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.931904 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:48 crc kubenswrapper[4585]: I0215 17:05:48.931948 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:48 crc kubenswrapper[4585]: 
I0215 17:05:48.931962 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.737624 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 07:54:16.767609792 +0000 UTC Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.931652 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa"} Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.931709 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7"} Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.931721 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.931728 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6"} Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.931797 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.931879 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.934040 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.934098 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.934115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.934170 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.934196 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:49 crc kubenswrapper[4585]: I0215 17:05:49.934210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.197333 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.738012 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 15:42:29.520979116 +0000 UTC Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.890113 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.940813 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.940835 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7"} Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.940877 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 
17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.940910 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f"} Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.940916 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.942341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.942398 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.942422 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.942349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.942479 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:50 crc kubenswrapper[4585]: I0215 17:05:50.942504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.215725 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.217554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.217642 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.217668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.217708 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.738790 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 16:47:49.772891321 +0000 UTC Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.867106 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.943504 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.943505 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.945702 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.945761 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.945774 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.945765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.945888 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:51 crc kubenswrapper[4585]: I0215 17:05:51.945900 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.272168 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.272341 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.274065 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.274126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.274151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.739585 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 10:59:40.209746983 +0000 UTC Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.776173 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.946379 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.948633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.948702 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:52 crc kubenswrapper[4585]: I0215 17:05:52.948728 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:53 crc kubenswrapper[4585]: I0215 17:05:53.658226 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:05:53 crc kubenswrapper[4585]: I0215 17:05:53.658859 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:53 crc kubenswrapper[4585]: I0215 17:05:53.660469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:53 crc kubenswrapper[4585]: I0215 17:05:53.660531 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:53 crc kubenswrapper[4585]: I0215 17:05:53.660548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:53 crc kubenswrapper[4585]: I0215 17:05:53.740768 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 22:08:36.477359072 +0000 UTC Feb 15 17:05:54 crc kubenswrapper[4585]: I0215 17:05:54.741938 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:40:28.839839267 +0000 UTC Feb 15 17:05:54 crc kubenswrapper[4585]: E0215 17:05:54.904067 4585 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.295004 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.295255 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:55 crc 
kubenswrapper[4585]: I0215 17:05:55.296813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.296849 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.296862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.305235 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.305402 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.306690 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.306716 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.306728 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.593524 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.601258 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.742554 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-11-24 22:05:20.736117331 +0000 UTC Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.873476 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.955336 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.957005 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.957050 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.957067 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:55 crc kubenswrapper[4585]: I0215 17:05:55.962136 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:05:56 crc kubenswrapper[4585]: I0215 17:05:56.743017 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 14:23:20.087618106 +0000 UTC Feb 15 17:05:56 crc kubenswrapper[4585]: I0215 17:05:56.958364 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:56 crc kubenswrapper[4585]: I0215 17:05:56.959779 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:56 crc kubenswrapper[4585]: I0215 17:05:56.959813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:56 crc kubenswrapper[4585]: I0215 17:05:56.959824 4585 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:57 crc kubenswrapper[4585]: I0215 17:05:57.743992 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 20:45:52.263561442 +0000 UTC Feb 15 17:05:57 crc kubenswrapper[4585]: I0215 17:05:57.961413 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:57 crc kubenswrapper[4585]: I0215 17:05:57.962910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:57 crc kubenswrapper[4585]: I0215 17:05:57.963119 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:57 crc kubenswrapper[4585]: I0215 17:05:57.963259 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.552139 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.552212 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 15 17:05:58 crc kubenswrapper[4585]: W0215 17:05:58.727938 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.728042 4585 trace.go:236] Trace[965838568]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Feb-2026 17:05:48.726) (total time: 10001ms): Feb 15 17:05:58 crc kubenswrapper[4585]: Trace[965838568]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (17:05:58.727) Feb 15 17:05:58 crc kubenswrapper[4585]: Trace[965838568]: [10.001059276s] [10.001059276s] END Feb 15 17:05:58 crc kubenswrapper[4585]: E0215 17:05:58.728066 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.734736 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.744701 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 01:26:50.540309202 +0000 UTC Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.874087 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.874224 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 15 17:05:58 crc kubenswrapper[4585]: W0215 17:05:58.905731 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.905835 4585 trace.go:236] Trace[1941445459]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Feb-2026 17:05:48.903) (total time: 10001ms): Feb 15 17:05:58 crc kubenswrapper[4585]: Trace[1941445459]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:05:58.905) Feb 15 17:05:58 crc kubenswrapper[4585]: Trace[1941445459]: [10.001934359s] [10.001934359s] END Feb 15 17:05:58 crc kubenswrapper[4585]: E0215 17:05:58.905869 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.965895 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.968787 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645" exitCode=255 Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.968842 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645"} Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.969049 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.970222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.970339 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.970433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:05:58 crc kubenswrapper[4585]: I0215 17:05:58.971137 4585 scope.go:117] "RemoveContainer" containerID="bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645" Feb 15 17:05:59 crc kubenswrapper[4585]: E0215 17:05:59.070816 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.18947a68f8b0e238 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-15 17:05:44.730976824 +0000 UTC m=+0.674384996,LastTimestamp:2026-02-15 17:05:44.730976824 +0000 UTC m=+0.674384996,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.563176 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.563449 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.567942 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io 
\"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.568024 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.745452 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 22:28:21.659094497 +0000 UTC Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.974801 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.976938 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4"} Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.977131 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.978629 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.978662 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:05:59 crc kubenswrapper[4585]: I0215 17:05:59.978674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:00 crc kubenswrapper[4585]: I0215 17:06:00.211845 4585 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]log ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]etcd ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/generic-apiserver-start-informers ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/priority-and-fairness-filter ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-apiextensions-informers ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-apiextensions-controllers ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/crd-informer-synced ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-system-namespaces-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 15 17:06:00 crc kubenswrapper[4585]: 
[+]poststarthook/start-legacy-token-tracking-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 15 17:06:00 crc kubenswrapper[4585]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 15 17:06:00 crc kubenswrapper[4585]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/bootstrap-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/start-kube-aggregator-informers ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/apiservice-registration-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/apiservice-discovery-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]autoregister-completion ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/apiservice-openapi-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 15 17:06:00 crc kubenswrapper[4585]: livez check failed Feb 15 17:06:00 crc kubenswrapper[4585]: I0215 17:06:00.211944 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:06:00 crc 
kubenswrapper[4585]: I0215 17:06:00.745656 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 00:53:44.638961861 +0000 UTC Feb 15 17:06:01 crc kubenswrapper[4585]: I0215 17:06:01.746181 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 18:23:54.108381199 +0000 UTC Feb 15 17:06:01 crc kubenswrapper[4585]: I0215 17:06:01.867981 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:06:01 crc kubenswrapper[4585]: I0215 17:06:01.868352 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:06:01 crc kubenswrapper[4585]: I0215 17:06:01.869775 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:01 crc kubenswrapper[4585]: I0215 17:06:01.870002 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:01 crc kubenswrapper[4585]: I0215 17:06:01.870142 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.746981 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:12:40.842140686 +0000 UTC Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.817245 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.818056 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.820540 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.820633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.820653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.841385 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 15 17:06:02 crc kubenswrapper[4585]: I0215 17:06:02.913132 4585 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.562159 4585 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.724203 4585 apiserver.go:52] "Watching apiserver" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.729619 4585 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.730111 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.730557 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.730809 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:03 crc kubenswrapper[4585]: E0215 17:06:03.730936 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.730970 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.731146 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.731385 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:03 crc kubenswrapper[4585]: E0215 17:06:03.731457 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.731399 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:03 crc kubenswrapper[4585]: E0215 17:06:03.731533 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.734363 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.734542 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.734382 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.734932 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.734967 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.735441 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.735855 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.736119 4585 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.743976 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.748036 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 08:37:08.121230539 +0000 UTC Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.749355 4585 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.777017 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.791694 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.804483 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.815301 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.825514 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.847189 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.858803 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:03 crc kubenswrapper[4585]: I0215 17:06:03.872409 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.565712 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.568733 4585 trace.go:236] Trace[1351798240]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Feb-2026 17:05:52.443) (total time: 12124ms): Feb 15 17:06:04 crc kubenswrapper[4585]: Trace[1351798240]: ---"Objects listed" error: 12124ms (17:06:04.568) Feb 15 17:06:04 
crc kubenswrapper[4585]: Trace[1351798240]: [12.124802297s] [12.124802297s] END Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.568969 4585 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.568880 4585 trace.go:236] Trace[1828810101]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (15-Feb-2026 17:05:54.182) (total time: 10385ms): Feb 15 17:06:04 crc kubenswrapper[4585]: Trace[1828810101]: ---"Objects listed" error: 10385ms (17:06:04.568) Feb 15 17:06:04 crc kubenswrapper[4585]: Trace[1828810101]: [10.385993304s] [10.385993304s] END Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.569121 4585 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.574156 4585 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.575664 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.578737 4585 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.613904 4585 csr.go:261] certificate signing request csr-6g7rr is approved, waiting to be issued Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.627032 4585 csr.go:257] certificate signing request csr-6g7rr is issued Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675206 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675258 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675284 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675306 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675322 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675337 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675353 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675372 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675397 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675424 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675441 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675460 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675505 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675523 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675539 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675558 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675571 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675623 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675776 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675836 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675864 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675881 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.675902 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676452 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676440 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676477 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676683 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676705 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676753 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676882 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676947 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676983 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.676990 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677069 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677183 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677260 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677313 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677392 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677427 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677487 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677518 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677528 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677564 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677673 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677731 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677555 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677785 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677808 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 15 17:06:04 crc 
kubenswrapper[4585]: I0215 17:06:04.677826 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677829 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677847 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677862 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677867 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677907 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677927 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677948 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677967 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.677983 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678000 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678015 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678019 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678030 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678048 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678068 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678087 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678107 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678122 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678139 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678154 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678222 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678242 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678264 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678279 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678297 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678317 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678333 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678348 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678367 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678386 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678404 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678420 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678438 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678454 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678471 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678485 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678502 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678517 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678534 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678551 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678566 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678583 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678632 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678649 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678666 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678682 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678697 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678713 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678728 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678743 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678758 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678774 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678790 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678806 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678824 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678863 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678879 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678898 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678919 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678936 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678956 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679050 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679069 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679087 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679105 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679121 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679136 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679156 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679173 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679190 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679206 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679225 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679243 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679261 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679275 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679290 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679306 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679321 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679335 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679354 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679369 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679386 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679402 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679418 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679434 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679449 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679465 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679506 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679522 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679540 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679558 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679576 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679591 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679628 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679647 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679664 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679682 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679698 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679714 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679731 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679747 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679779 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679793 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679810 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679828 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679843 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679860 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679878 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679894 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679910 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679926 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\"
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679941 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679957 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679972 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679988 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680004 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680020 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680035 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680051 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680066 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680082 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680098 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680116 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680132 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680147 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680166 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680181 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680197 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680212 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680227 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680242 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680257 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680276 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680293 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680313 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680329 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680345 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680360 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680376 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680393 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680409 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680425 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680442 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680458 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" 
(UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680475 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680491 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680507 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680523 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680539 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680556 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680571 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680587 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680619 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680634 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680652 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680670 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680684 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680701 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680716 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680732 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680747 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680779 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680795 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680812 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680829 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 15 17:06:04 crc 
kubenswrapper[4585]: I0215 17:06:04.680845 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680862 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680878 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680922 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680944 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680969 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680992 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681011 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681027 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681045 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681062 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681287 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681307 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681323 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681340 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678150 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678180 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678285 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678370 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678532 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678638 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678692 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678801 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678895 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678952 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.678988 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679060 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679204 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679214 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679367 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679398 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679519 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.679626 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.682746 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.680940 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681331 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681362 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681508 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681695 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.682805 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.681917 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.682292 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.682404 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.682543 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.682570 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.682986 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683066 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683093 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683122 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683140 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683162 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683177 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683188 4585 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683201 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683211 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683221 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683231 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on 
node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683242 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683254 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683263 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683272 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683284 4585 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683294 4585 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683303 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683312 4585 reconciler_common.go:293] 
"Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683321 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683331 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683341 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683351 4585 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683360 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683369 4585 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683379 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683388 4585 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683398 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683271 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683416 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683398 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.683787 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684106 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684160 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684166 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684465 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684639 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684677 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684744 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684867 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.684988 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.685156 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.685246 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.685375 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.685387 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.685419 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.685708 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.685835 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.686999 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.687038 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.687415 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.687636 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.688043 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.688365 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.689004 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.689073 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.689586 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.689784 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.690390 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.691476 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.691899 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.691129 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.692278 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.692584 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.692621 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.692828 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.692949 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.693008 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.693383 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.693841 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.694019 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.694442 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.695055 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.695199 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.695326 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.695619 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.695716 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.695853 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.695865 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.696076 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.696166 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.696211 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.696370 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:06:05.196345263 +0000 UTC m=+21.139753395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.697658 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.697900 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.697962 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.698200 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.698455 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.701520 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.701560 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.701965 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.702759 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.702877 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.703166 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.703486 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.703731 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.703746 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.704236 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.704909 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.705836 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.706418 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.706656 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.706706 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.706787 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.707039 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.707042 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.707301 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.707444 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.707567 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.707690 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.707718 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-15 17:06:05.207690874 +0000 UTC m=+21.151099006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.707987 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.708211 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.709426 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.709497 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.709566 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.710754 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.712186 4585 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.713143 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.728302 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.728422 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:05.228393036 +0000 UTC m=+21.171801388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.728982 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.729430 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.729671 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.731084 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.731933 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.732551 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.732927 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: 
"0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.733146 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.733255 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.733325 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.733493 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:05.233472422 +0000 UTC m=+21.176880554 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.734617 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.734835 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.735005 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.735972 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.736071 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.736093 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.736170 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.736321 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.736360 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.736545 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.736830 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.737225 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.738046 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.738352 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.739911 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.740163 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.743633 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.744154 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.744294 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.745271 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.748312 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.748372 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:09:13.259629857 +0000 UTC Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.751487 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.759273 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.759333 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:04 crc kubenswrapper[4585]: E0215 17:06:04.759465 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:05.259446025 +0000 UTC m=+21.202854157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.758166 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.769733 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.779109 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wvfh6"] Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.779787 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.780587 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7vtnf"] Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.780954 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.781999 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.785659 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.786039 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.786252 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.786806 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.787172 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.787443 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.787695 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 15 17:06:04 crc 
kubenswrapper[4585]: I0215 17:06:04.788047 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.789556 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.789767 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.789999 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790076 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790138 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790197 4585 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node 
\"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790255 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790312 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790378 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790458 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.790524 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.789885 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791206 4585 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node 
\"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791313 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791392 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791466 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791532 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791613 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791670 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791733 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc 
kubenswrapper[4585]: I0215 17:06:04.791791 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791852 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791905 4585 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.791966 4585 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792031 4585 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792089 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792162 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 
17:06:04.792233 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792298 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792373 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792449 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792511 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792571 4585 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792780 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.792882 4585 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.795594 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.795725 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.795845 4585 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.795928 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.795986 4585 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.796071 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.796162 4585 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc 
kubenswrapper[4585]: I0215 17:06:04.796252 4585 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.796369 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.796458 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.796790 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.796887 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.796946 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797019 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797086 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797210 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797281 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797356 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797437 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797511 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797617 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797706 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" 
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797773 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797829 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797886 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797939 4585 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.797999 4585 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.798065 4585 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799135 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799256 
4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799360 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799488 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.789921 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799747 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799823 4585 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799880 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" 
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.799944 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800127 4585 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800211 4585 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800312 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800390 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800449 4585 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800507 4585 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800717 4585 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800777 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800830 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800886 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.800943 4585 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801005 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801062 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801115 4585 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801206 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801264 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801320 4585 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801387 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801449 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801506 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801564 4585 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc 
kubenswrapper[4585]: I0215 17:06:04.801755 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801825 4585 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801899 4585 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.801978 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802044 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802240 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802314 4585 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802384 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802451 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802505 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802557 4585 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803017 4585 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803110 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803180 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804164 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" 
Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804250 4585 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804312 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804390 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804449 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804509 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804569 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804647 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804786 4585 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.811861 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.811937 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.812000 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.812073 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.812129 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.812184 4585 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.812249 4585 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802911 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.812320 4585 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814742 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814758 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814769 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814795 4585 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 
15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814807 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814817 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814828 4585 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814838 4585 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814852 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814865 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814877 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814891 4585 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814902 4585 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814914 4585 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814926 4585 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814945 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814957 4585 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814971 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814984 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" 
(UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.814995 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.815005 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.815016 4585 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.815026 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.815037 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.806165 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.802986 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803321 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803535 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803625 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803637 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.803659 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804070 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.804228 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.806009 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.811670 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.813241 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.813307 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.813380 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.821946 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.821994 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.822564 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.822732 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.823531 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.823790 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.824109 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.826880 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.827328 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.827405 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.827805 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.841091 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.854324 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.854925 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.856537 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.857254 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 15 17:06:04 crc 
kubenswrapper[4585]: I0215 17:06:04.857353 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.858292 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.858792 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.859356 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.860973 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.864259 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.864971 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.865926 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.866443 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.869921 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.870580 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.871113 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.874812 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.875330 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.876320 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.876713 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.877278 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.881279 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.882049 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.883235 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.883760 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.884885 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.885332 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.885963 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.887881 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.889205 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.889713 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.890762 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.891276 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.892192 4585 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.892300 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.894108 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.895171 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.895775 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.897409 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.898076 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.898977 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.899590 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.900753 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.901324 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.902496 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.903532 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.906807 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.907377 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.908318 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.909321 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.909350 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.910440 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.911395 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.912584 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.913069 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.914480 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915774 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9gqm\" (UniqueName: \"kubernetes.io/projected/ff52deac-c128-47a3-b1fe-15ee558b62b4-kube-api-access-x9gqm\") pod \"node-resolver-wvfh6\" (UID: \"ff52deac-c128-47a3-b1fe-15ee558b62b4\") " pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915828 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/41851c0a-0f98-4a53-b102-505ee4f6b1ea-serviceca\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915880 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41851c0a-0f98-4a53-b102-505ee4f6b1ea-host\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915901 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ff52deac-c128-47a3-b1fe-15ee558b62b4-hosts-file\") pod \"node-resolver-wvfh6\" (UID: \"ff52deac-c128-47a3-b1fe-15ee558b62b4\") " pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915931 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52s78\" (UniqueName: \"kubernetes.io/projected/41851c0a-0f98-4a53-b102-505ee4f6b1ea-kube-api-access-52s78\") pod \"node-ca-7vtnf\" (UID: 
\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915961 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915974 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915985 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.915998 4585 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916007 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916018 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916030 4585 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 
crc kubenswrapper[4585]: I0215 17:06:04.916042 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916052 4585 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916064 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916074 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916084 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916094 4585 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916104 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916115 4585 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916125 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916135 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916145 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916155 4585 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916165 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916174 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916185 4585 reconciler_common.go:293] "Volume detached for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916194 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916203 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916213 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916222 4585 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.916877 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.917362 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.947178 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.955825 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.959174 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71a
d87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false
,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.961014 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 15 17:06:04 crc kubenswrapper[4585]: W0215 17:06:04.965851 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-fefc6f63ed0ce155630a145084e77a0580b2b5c4899751beebab597ad3b4d93c WatchSource:0}: Error finding container fefc6f63ed0ce155630a145084e77a0580b2b5c4899751beebab597ad3b4d93c: Status 404 returned error can't find the container with id fefc6f63ed0ce155630a145084e77a0580b2b5c4899751beebab597ad3b4d93c Feb 15 17:06:04 crc kubenswrapper[4585]: I0215 17:06:04.998102 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.010033 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f0c10dcef128e9e0dae6ace20fe7c3eaa93c0eb427bbdae9d8926766847451da"} Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.011266 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fefc6f63ed0ce155630a145084e77a0580b2b5c4899751beebab597ad3b4d93c"} Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.016524 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7546573647f77f7a359068c78f54a30aa1178ec2c902d5f02a609d10f036ee85"} Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.018974 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9gqm\" (UniqueName: \"kubernetes.io/projected/ff52deac-c128-47a3-b1fe-15ee558b62b4-kube-api-access-x9gqm\") pod \"node-resolver-wvfh6\" (UID: \"ff52deac-c128-47a3-b1fe-15ee558b62b4\") " pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.019020 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/41851c0a-0f98-4a53-b102-505ee4f6b1ea-serviceca\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.019055 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41851c0a-0f98-4a53-b102-505ee4f6b1ea-host\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.019073 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ff52deac-c128-47a3-b1fe-15ee558b62b4-hosts-file\") pod \"node-resolver-wvfh6\" (UID: \"ff52deac-c128-47a3-b1fe-15ee558b62b4\") " pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.019100 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52s78\" (UniqueName: \"kubernetes.io/projected/41851c0a-0f98-4a53-b102-505ee4f6b1ea-kube-api-access-52s78\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " 
pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.019780 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41851c0a-0f98-4a53-b102-505ee4f6b1ea-host\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.019833 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ff52deac-c128-47a3-b1fe-15ee558b62b4-hosts-file\") pod \"node-resolver-wvfh6\" (UID: \"ff52deac-c128-47a3-b1fe-15ee558b62b4\") " pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.023440 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/41851c0a-0f98-4a53-b102-505ee4f6b1ea-serviceca\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.023951 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.056241 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9gqm\" (UniqueName: \"kubernetes.io/projected/ff52deac-c128-47a3-b1fe-15ee558b62b4-kube-api-access-x9gqm\") pod \"node-resolver-wvfh6\" (UID: \"ff52deac-c128-47a3-b1fe-15ee558b62b4\") " pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 
17:06:05.060983 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.078571 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52s78\" (UniqueName: \"kubernetes.io/projected/41851c0a-0f98-4a53-b102-505ee4f6b1ea-kube-api-access-52s78\") pod \"node-ca-7vtnf\" (UID: \"41851c0a-0f98-4a53-b102-505ee4f6b1ea\") " pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.097939 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.102764 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wvfh6" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.107707 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7vtnf" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.126087 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.149229 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: W0215 17:06:05.150042 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41851c0a_0f98_4a53_b102_505ee4f6b1ea.slice/crio-ead0fe145d6bd517145706346cba1a5efdbef5632ed0611d6fbd032be823a620 WatchSource:0}: Error finding container 
ead0fe145d6bd517145706346cba1a5efdbef5632ed0611d6fbd032be823a620: Status 404 returned error can't find the container with id ead0fe145d6bd517145706346cba1a5efdbef5632ed0611d6fbd032be823a620 Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.182482 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4hptv"] Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.183138 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.193452 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.193975 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.194411 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.194585 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.194746 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.194911 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 15 
17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.209199 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.221863 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.222867 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.227174 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.227274 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:05 crc 
kubenswrapper[4585]: E0215 17:06:05.227388 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.227439 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:06.227423099 +0000 UTC m=+22.170831231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.227492 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:06:06.227485741 +0000 UTC m=+22.170893873 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.237875 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.266331 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.292255 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.310800 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.310884 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.328953 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:05 crc 
kubenswrapper[4585]: I0215 17:06:05.329005 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.329030 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.329052 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c41aeb2-e722-4379-b7d6-fe499719f9d2-proxy-tls\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.329088 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcsmn\" (UniqueName: \"kubernetes.io/projected/0c41aeb2-e722-4379-b7d6-fe499719f9d2-kube-api-access-fcsmn\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.329110 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0c41aeb2-e722-4379-b7d6-fe499719f9d2-rootfs\") pod 
\"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.329131 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c41aeb2-e722-4379-b7d6-fe499719f9d2-mcd-auth-proxy-config\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329283 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329313 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329325 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329360 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:06.329347483 +0000 UTC m=+22.272755615 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329423 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329433 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329441 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329481 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:06.329472806 +0000 UTC m=+22.272880938 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329542 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.329562 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:06.329557278 +0000 UTC m=+22.272965410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.361856 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.378854 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.396357 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.412414 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.424733 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.429892 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcsmn\" (UniqueName: \"kubernetes.io/projected/0c41aeb2-e722-4379-b7d6-fe499719f9d2-kube-api-access-fcsmn\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.429962 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0c41aeb2-e722-4379-b7d6-fe499719f9d2-rootfs\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.429985 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c41aeb2-e722-4379-b7d6-fe499719f9d2-mcd-auth-proxy-config\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.430027 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c41aeb2-e722-4379-b7d6-fe499719f9d2-proxy-tls\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.430191 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0c41aeb2-e722-4379-b7d6-fe499719f9d2-rootfs\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.430801 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0c41aeb2-e722-4379-b7d6-fe499719f9d2-mcd-auth-proxy-config\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.435824 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.437866 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c41aeb2-e722-4379-b7d6-fe499719f9d2-proxy-tls\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.448905 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.464140 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcsmn\" (UniqueName: \"kubernetes.io/projected/0c41aeb2-e722-4379-b7d6-fe499719f9d2-kube-api-access-fcsmn\") pod \"machine-config-daemon-4hptv\" (UID: \"0c41aeb2-e722-4379-b7d6-fe499719f9d2\") " pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 
17:06:05.470484 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.502906 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.519372 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.536844 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.552031 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.580833 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.596134 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.619729 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-n4ps2"] Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.620244 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.621082 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-bwj9b"] Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.621751 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vp6tl"] Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.621966 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.622646 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.625207 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.625320 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.625415 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.625527 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.625570 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.628752 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-15 17:01:04 +0000 UTC, rotation deadline is 2026-11-30 13:41:54.541223097 +0000 UTC Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.628821 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6908h35m48.912404011s for next certificate rotation Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.629302 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.629355 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.629386 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.630018 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.633182 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.633416 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.633670 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.634294 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.634698 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.635121 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.656038 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.671622 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.684376 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.707520 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:05:58Z\\\",\\\"message\\\":\\\"W0215 17:05:48.038471 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0215 17:05:48.038918 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771175148 cert, and key in /tmp/serving-cert-3963139930/serving-signer.crt, /tmp/serving-cert-3963139930/serving-signer.key\\\\nI0215 17:05:48.202726 1 observer_polling.go:159] Starting file observer\\\\nW0215 17:05:48.207713 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0215 17:05:48.207968 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0215 17:05:48.209111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3963139930/tls.crt::/tmp/serving-cert-3963139930/tls.key\\\\\\\"\\\\nF0215 17:05:58.454273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.722530 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732303 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-os-release\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732340 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-config\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732365 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-cni-multus\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " 
pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732387 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5acdc04-0978-4907-bd9e-965400ded9bf-ovn-node-metrics-cert\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732409 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-netns\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732428 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r966t\" (UniqueName: \"kubernetes.io/projected/70645395-8d49-4495-a647-b6d43206ecbc-kube-api-access-r966t\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732445 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-log-socket\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732464 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsp49\" (UniqueName: \"kubernetes.io/projected/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-kube-api-access-vsp49\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: 
\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732489 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-cnibin\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732505 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-bin\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732528 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-ovn\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732543 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-netns\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732623 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-os-release\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: 
\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732643 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-cni-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732660 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-kubelet\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732694 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732716 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70645395-8d49-4495-a647-b6d43206ecbc-multus-daemon-config\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732733 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-etc-openvswitch\") pod 
\"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732750 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70645395-8d49-4495-a647-b6d43206ecbc-cni-binary-copy\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732767 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-hostroot\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732788 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h74xg\" (UniqueName: \"kubernetes.io/projected/e5acdc04-0978-4907-bd9e-965400ded9bf-kube-api-access-h74xg\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732813 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-system-cni-dir\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732832 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-conf-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732854 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-ovn-kubernetes\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732873 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-script-lib\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732890 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cnibin\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732907 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-socket-dir-parent\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732925 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-slash\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732940 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-node-log\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732965 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-etc-kubernetes\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732980 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-systemd\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.732995 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-var-lib-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733015 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733044 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-k8s-cni-cncf-io\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733060 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733075 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733097 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-systemd-units\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc 
kubenswrapper[4585]: I0215 17:06:05.733112 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733126 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-netd\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733142 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-env-overrides\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733158 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-system-cni-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733186 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-cni-bin\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc 
kubenswrapper[4585]: I0215 17:06:05.733203 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-kubelet\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.733230 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-multus-certs\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.734978 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.744467 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.759417 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 04:33:02.619610338 +0000 UTC Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.763125 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.786553 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\
",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.798901 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.813586 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.822676 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834505 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-script-lib\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc 
kubenswrapper[4585]: I0215 17:06:05.834546 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h74xg\" (UniqueName: \"kubernetes.io/projected/e5acdc04-0978-4907-bd9e-965400ded9bf-kube-api-access-h74xg\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834567 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-system-cni-dir\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834629 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-conf-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834648 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-ovn-kubernetes\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834668 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-slash\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834685 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cnibin\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834703 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-socket-dir-parent\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834723 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-node-log\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834744 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-etc-kubernetes\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834760 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-systemd\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834775 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-var-lib-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834792 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834817 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834833 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-k8s-cni-cncf-io\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834848 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834870 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-systemd-units\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834885 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-multus-certs\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834902 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834919 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-netd\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834935 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-env-overrides\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834951 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-system-cni-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834965 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-cni-bin\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834981 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-kubelet\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.834996 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-os-release\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835009 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-config\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835027 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-netns\") pod \"multus-n4ps2\" (UID: 
\"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835045 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-cni-multus\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835062 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5acdc04-0978-4907-bd9e-965400ded9bf-ovn-node-metrics-cert\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835077 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-cnibin\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835091 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r966t\" (UniqueName: \"kubernetes.io/projected/70645395-8d49-4495-a647-b6d43206ecbc-kube-api-access-r966t\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835106 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-log-socket\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: 
I0215 17:06:05.835125 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsp49\" (UniqueName: \"kubernetes.io/projected/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-kube-api-access-vsp49\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835140 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-bin\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835156 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-ovn\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835174 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-cni-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835191 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-netns\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835212 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-os-release\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835226 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-kubelet\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835242 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835259 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-hostroot\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835272 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70645395-8d49-4495-a647-b6d43206ecbc-multus-daemon-config\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835288 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-etc-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835303 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70645395-8d49-4495-a647-b6d43206ecbc-cni-binary-copy\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835452 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-script-lib\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.835666 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-system-cni-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836097 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/70645395-8d49-4495-a647-b6d43206ecbc-cni-binary-copy\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836224 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-socket-dir-parent\") pod \"multus-n4ps2\" (UID: 
\"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836282 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-var-lib-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836300 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-system-cni-dir\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836319 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-conf-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836335 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-env-overrides\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836360 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-ovn-kubernetes\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 
15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836367 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836380 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-slash\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836389 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-multus-certs\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836402 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cnibin\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836407 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-netd\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836443 4585 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-systemd-units\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836458 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-bin\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836463 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-node-log\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836477 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-etc-kubernetes\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836502 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-ovn\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836512 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-systemd\") pod \"ovnkube-node-vp6tl\" (UID: 
\"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-multus-cni-dir\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836681 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836689 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-netns\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836725 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-k8s-cni-cncf-io\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836748 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-run-netns\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc 
kubenswrapper[4585]: I0215 17:06:05.836905 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-os-release\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836944 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836978 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-cni-bin\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836986 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-hostroot\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.836988 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-kubelet\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837003 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-kubelet\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837050 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-os-release\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837065 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-host-var-lib-cni-multus\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837457 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837486 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837554 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-etc-openvswitch\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837648 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-log-socket\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837705 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/70645395-8d49-4495-a647-b6d43206ecbc-cnibin\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837788 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-config\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.837801 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/70645395-8d49-4495-a647-b6d43206ecbc-multus-daemon-config\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.840246 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5acdc04-0978-4907-bd9e-965400ded9bf-ovn-node-metrics-cert\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.841165 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.841376 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.841523 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.841679 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.841754 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:05 crc kubenswrapper[4585]: E0215 17:06:05.841970 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.843928 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.869337 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h74xg\" (UniqueName: \"kubernetes.io/projected/e5acdc04-0978-4907-bd9e-965400ded9bf-kube-api-access-h74xg\") pod \"ovnkube-node-vp6tl\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.871799 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsp49\" (UniqueName: \"kubernetes.io/projected/5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1-kube-api-access-vsp49\") pod \"multus-additional-cni-plugins-bwj9b\" (UID: \"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\") " pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.876100 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r966t\" (UniqueName: \"kubernetes.io/projected/70645395-8d49-4495-a647-b6d43206ecbc-kube-api-access-r966t\") pod \"multus-n4ps2\" (UID: \"70645395-8d49-4495-a647-b6d43206ecbc\") " pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.884675 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.886226 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.901264 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.924877 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.944360 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.944926 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n4ps2" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.945735 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.958659 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.959008 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.964543 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:05 crc kubenswrapper[4585]: I0215 17:06:05.986061 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.012658 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.027330 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.030023 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerStarted","Data":"6ff431d40f953fa5a43db89717d36dc4a8082d004b42a1167588052aec7dbdea"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.046816 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:05:58Z\\\",\\\"message\\\":\\\"W0215 17:05:48.038471 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0215 17:05:48.038918 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771175148 cert, and key in /tmp/serving-cert-3963139930/serving-signer.crt, /tmp/serving-cert-3963139930/serving-signer.key\\\\nI0215 17:05:48.202726 1 observer_polling.go:159] Starting file observer\\\\nW0215 17:05:48.207713 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0215 17:05:48.207968 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0215 17:05:48.209111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3963139930/tls.crt::/tmp/serving-cert-3963139930/tls.key\\\\\\\"\\\\nF0215 17:05:58.454273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.056126 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7vtnf" event={"ID":"41851c0a-0f98-4a53-b102-505ee4f6b1ea","Type":"ContainerStarted","Data":"2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.056182 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7vtnf" 
event={"ID":"41851c0a-0f98-4a53-b102-505ee4f6b1ea","Type":"ContainerStarted","Data":"ead0fe145d6bd517145706346cba1a5efdbef5632ed0611d6fbd032be823a620"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.064852 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.065026 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.065037 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"9765119c1f9bac1becd41d7cc84d64c06ee83724b0306e0b3ceb88ff77ee4191"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.075857 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.077346 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:05:58Z\\\",\\\"message\\\":\\\"W0215 17:05:48.038471 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0215 17:05:48.038918 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771175148 cert, and key in /tmp/serving-cert-3963139930/serving-signer.crt, /tmp/serving-cert-3963139930/serving-signer.key\\\\nI0215 17:05:48.202726 1 observer_polling.go:159] Starting file observer\\\\nW0215 17:05:48.207713 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0215 17:05:48.207968 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0215 17:05:48.209111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3963139930/tls.crt::/tmp/serving-cert-3963139930/tls.key\\\\\\\"\\\\nF0215 17:05:58.454273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.078524 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.084380 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4" exitCode=255 Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.084691 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.084742 4585 scope.go:117] "RemoveContainer" containerID="bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.085249 4585 scope.go:117] "RemoveContainer" containerID="0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4" Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.085492 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 
10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.086982 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerStarted","Data":"b88448a38cb9423499524f02364e077bb6ce0ded7ead30409987013d8f4e8150"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.096966 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wvfh6" event={"ID":"ff52deac-c128-47a3-b1fe-15ee558b62b4","Type":"ContainerStarted","Data":"f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.097012 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wvfh6" event={"ID":"ff52deac-c128-47a3-b1fe-15ee558b62b4","Type":"ContainerStarted","Data":"7d2b641d43924bbc1e42716a6c25b366131bb07fb75f0f8dff211928f5bdebdd"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.100160 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.102727 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"d7ea8bc1b3f2011951fbe647194ae6634724839334ed4aa0d1e0449bf596d5bd"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.105277 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.105327 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1"} Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.120313 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.137499 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.155444 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd
-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.170675 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.180577 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.194527 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.213324 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.227901 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.249006 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.251278 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.251454 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:06:08.251425609 +0000 UTC m=+24.194833741 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.251580 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.251742 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.251851 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:08.251825379 +0000 UTC m=+24.195233511 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.267185 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.290408 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.305096 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.324551 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.347321 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a7
4437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.353439 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.353691 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 
17:06:06.353844 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.353747 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354071 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354170 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.353911 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354306 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354330 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354079 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354403 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:08.354380698 +0000 UTC m=+24.297788840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354481 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:08.35445514 +0000 UTC m=+24.297863422 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:06 crc kubenswrapper[4585]: E0215 17:06:06.354652 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:08.354628114 +0000 UTC m=+24.298036256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.360857 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.378527 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.391913 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.421376 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.458975 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.499836 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.539527 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.585878 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.621857 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.671660 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.707183 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:05:58Z\\\",\\\"message\\\":\\\"W0215 17:05:48.038471 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0215 17:05:48.038918 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771175148 cert, and key in /tmp/serving-cert-3963139930/serving-signer.crt, /tmp/serving-cert-3963139930/serving-signer.key\\\\nI0215 17:05:48.202726 1 observer_polling.go:159] Starting file observer\\\\nW0215 17:05:48.207713 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0215 17:05:48.207968 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0215 17:05:48.209111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3963139930/tls.crt::/tmp/serving-cert-3963139930/tls.key\\\\\\\"\\\\nF0215 17:05:58.454273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.742752 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.760363 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 16:17:52.462281582 +0000 UTC Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.785946 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:06 crc kubenswrapper[4585]: I0215 17:06:06.829804 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:06Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.109856 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe" exitCode=0 Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.109952 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" 
event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.112178 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.115945 4585 scope.go:117] "RemoveContainer" containerID="0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4" Feb 15 17:06:07 crc kubenswrapper[4585]: E0215 17:06:07.116274 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.117242 4585 generic.go:334] "Generic (PLEG): container finished" podID="5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1" containerID="eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e" exitCode=0 Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.117311 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerDied","Data":"eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e"} Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.132989 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerStarted","Data":"e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156"} Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.157727 4585 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd575119ecc9e20c6aa403a91a120c4be68e3ce6660cdb3ecd824e0a4f79f645\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:05:58Z\\\",\\\"message\\\":\\\"W0215 17:05:48.038471 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0215 17:05:48.038918 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771175148 cert, and key in /tmp/serving-cert-3963139930/serving-signer.crt, /tmp/serving-cert-3963139930/serving-signer.key\\\\nI0215 17:05:48.202726 1 observer_polling.go:159] Starting file observer\\\\nW0215 17:05:48.207713 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0215 17:05:48.207968 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0215 17:05:48.209111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3963139930/tls.crt::/tmp/serving-cert-3963139930/tls.key\\\\\\\"\\\\nF0215 17:05:58.454273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from 
k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.188541 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.212629 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.223493 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.243701 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.259884 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.271720 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.291427 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.306612 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.321048 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.355528 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.375848 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.389621 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.406143 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.465037 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.509791 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.538404 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.567909 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.616092 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.652006 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.686639 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.724422 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.753253 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.760714 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 
02:47:32.080863365 +0000 UTC Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.791311 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.832108 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721
789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.841508 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:07 crc kubenswrapper[4585]: E0215 17:06:07.841672 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.841734 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:07 crc kubenswrapper[4585]: E0215 17:06:07.841778 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.841826 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:07 crc kubenswrapper[4585]: E0215 17:06:07.841876 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.862548 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.906354 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.955135 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:07 crc kubenswrapper[4585]: I0215 17:06:07.978747 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:07Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.017587 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.137658 4585 generic.go:334] "Generic (PLEG): container finished" podID="5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1" containerID="0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0" exitCode=0 Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.137824 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerDied","Data":"0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.144119 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.144171 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.144188 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" 
event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.144202 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.144212 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.144238 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.146246 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719"} Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.166053 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.180770 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.200804 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721
789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.220274 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.235497 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.270282 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.273364 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.273571 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.273659 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:06:12.273621549 +0000 UTC m=+28.217029691 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.273681 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.273723 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:12.273712361 +0000 UTC m=+28.217120493 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.298469 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.340359 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.375101 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.375153 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.375223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375358 4585 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375386 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375423 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375473 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:12.375458519 +0000 UTC m=+28.318866671 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375544 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375556 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375566 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375593 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:12.375584602 +0000 UTC m=+28.318992744 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375778 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.375894 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:12.375871029 +0000 UTC m=+28.319279181 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.378667 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.425508 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.461258 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.501686 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.542490 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.551582 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.552153 4585 scope.go:117] "RemoveContainer" containerID="0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4" Feb 15 17:06:08 crc kubenswrapper[4585]: E0215 17:06:08.552288 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.585187 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.625286 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.665471 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.702189 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.740334 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.762980 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 00:26:38.336167654 +0000 UTC Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.784198 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.842957 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.863580 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.905664 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.945373 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:08 crc kubenswrapper[4585]: I0215 17:06:08.983754 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:08Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.023819 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.062687 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.099964 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.148278 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.155448 4585 generic.go:334] "Generic (PLEG): container finished" podID="5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1" containerID="651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf" exitCode=0 Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.155502 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" 
event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerDied","Data":"651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf"} Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.184726 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.228014 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.276294 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.304980 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.341973 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.377262 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.426759 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.459691 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.497724 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.551106 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.578736 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.622181 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.661418 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.699900 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.751864 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 
17:06:09.763768 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 23:49:28.021197812 +0000 UTC Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.783551 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.827766 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:09Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.841123 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:09 crc kubenswrapper[4585]: E0215 17:06:09.841338 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.841378 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:09 crc kubenswrapper[4585]: E0215 17:06:09.841462 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:09 crc kubenswrapper[4585]: I0215 17:06:09.841383 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:09 crc kubenswrapper[4585]: E0215 17:06:09.841665 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.162231 4585 generic.go:334] "Generic (PLEG): container finished" podID="5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1" containerID="b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0" exitCode=0 Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.162287 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerDied","Data":"b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0"} Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.201820 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b
6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.250031 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.295551 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.321345 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.350711 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.374514 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.401483 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.420705 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.436483 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.451194 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.463281 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.482479 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.497371 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.507616 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.520147 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:10Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.764678 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 03:05:06.144954704 +0000 UTC Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.976234 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.980405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.980463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.980480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.980688 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.991028 4585 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.991850 4585 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.995866 4585 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.996379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.996558 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.996737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:10 crc kubenswrapper[4585]: I0215 17:06:10.996880 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:10Z","lastTransitionTime":"2026-02-15T17:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.015138 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.020959 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.021019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.021037 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.021063 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.021081 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.036548 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.041773 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.041966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.042103 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.042264 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.042408 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.057789 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.062476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.062732 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.062870 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.063011 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.063128 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.078375 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.083084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.083119 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.083130 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.083149 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.083160 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.098748 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.098910 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.101459 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.101517 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.101529 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.101554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.101568 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.169174 4585 generic.go:334] "Generic (PLEG): container finished" podID="5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1" containerID="d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee" exitCode=0 Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.169254 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerDied","Data":"d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.181238 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.192511 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.204138 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.212723 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.212766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.212780 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.212807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.212820 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.220472 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z 
is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.257265 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02
-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.277441 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.291818 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.301662 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.317874 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.317913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.317949 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.317967 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.317976 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.318144 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.335167 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.353997 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.372942 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.387188 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.400119 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.421798 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.422070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.422403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.422548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc 
kubenswrapper[4585]: I0215 17:06:11.422736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.422878 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.441333 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:11Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.526769 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.526821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.526836 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.526856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.526868 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.636749 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.636817 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.636837 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.636865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.636884 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.741457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.741525 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.741550 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.741583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.742021 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.766077 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 13:51:51.650429996 +0000 UTC Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.841066 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.841298 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.841676 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.841928 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.842087 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:11 crc kubenswrapper[4585]: E0215 17:06:11.842333 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.845044 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.845084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.845102 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.845137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.845158 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.949713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.949776 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.949793 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.949817 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:11 crc kubenswrapper[4585]: I0215 17:06:11.949837 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:11Z","lastTransitionTime":"2026-02-15T17:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.054021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.054084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.054103 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.054139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.054166 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.158106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.158175 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.158192 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.158220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.158247 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.202632 4585 generic.go:334] "Generic (PLEG): container finished" podID="5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1" containerID="f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c" exitCode=0 Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.202743 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerDied","Data":"f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.224366 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a4
5dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.253116 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.264106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc 
kubenswrapper[4585]: I0215 17:06:12.264223 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.264245 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.264272 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.264289 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.286961 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.304093 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.321283 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.321654 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:06:20.321537696 +0000 UTC m=+36.264945868 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.321783 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.322970 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.322950 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.323085 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:20.323058473 +0000 UTC m=+36.266466625 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.340202 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.368429 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721
b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.369072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.369093 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.369101 4585 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.369126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.369137 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.384315 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.400184 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.423155 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.423218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.423262 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.423452 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.423474 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.423489 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.423554 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:20.42353288 +0000 UTC m=+36.366941022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.424021 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.424038 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.424050 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.424080 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:20.424070583 +0000 UTC m=+36.367478725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.424137 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:12 crc kubenswrapper[4585]: E0215 17:06:12.424163 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:20.424155145 +0000 UTC m=+36.367563287 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.424496 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.446890 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.462227 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.473003 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.473051 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.473062 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.473081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.473092 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.477933 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.492094 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.509272 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721
789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:12Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.575234 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.575268 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.575280 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.575300 4585 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.575316 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.678375 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.678415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.678426 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.678445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.678458 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.766978 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 17:56:20.406771367 +0000 UTC Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.783637 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.783717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.783737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.783769 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.783789 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.886889 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.887006 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.887028 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.887058 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.887078 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.990052 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.990123 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.990143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.990185 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:12 crc kubenswrapper[4585]: I0215 17:06:12.990209 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:12Z","lastTransitionTime":"2026-02-15T17:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.093242 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.093309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.093323 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.093348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.093362 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.196343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.196390 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.196403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.196437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.196451 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.213322 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" event={"ID":"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1","Type":"ContainerStarted","Data":"878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.220405 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.221361 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.221434 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.239554 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.265012 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.266666 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.266833 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.282871 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.300475 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.300549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.300569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc 
kubenswrapper[4585]: I0215 17:06:13.300626 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.300649 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.302757 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.320526 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.342116 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.366419 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.404232 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.404499 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.404594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.404638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.404668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.404686 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.427572 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.451726 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.466971 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.493715 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.508242 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.508325 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.508345 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.508437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.508458 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.519528 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.541139 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.567246 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.583197 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.601588 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.612200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.612331 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.612357 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.612404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.612510 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.621825 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z 
is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.647891 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02
-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.668572 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.694501 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.715087 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.716634 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.716705 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.716724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.716751 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.716770 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.739633 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.760838 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.767893 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 01:43:01.137666417 +0000 UTC Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.775568 4585 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.805067 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.820054 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.820117 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.820135 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.820162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.820180 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.822928 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.841196 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.841256 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.841282 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:13 crc kubenswrapper[4585]: E0215 17:06:13.841540 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:13 crc kubenswrapper[4585]: E0215 17:06:13.841723 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:13 crc kubenswrapper[4585]: E0215 17:06:13.841877 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.843364 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.858554 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.877873 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721
789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:13Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.923311 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.923363 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.923376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.923397 4585 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 15 17:06:13 crc kubenswrapper[4585]: I0215 17:06:13.923416 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:13Z","lastTransitionTime":"2026-02-15T17:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.027263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.027344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.027369 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.027406 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.027432 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.100312 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.130402 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.130456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.130468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.130487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.130505 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.233080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.233171 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.233195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.233233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.233265 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.336265 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.336319 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.336340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.336367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.336387 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.439665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.439750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.439770 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.439799 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.439820 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.544082 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.544158 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.544177 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.544214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.544236 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.574321 4585 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.654099 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.655253 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.655270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.655295 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.655306 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.758485 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.758566 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.758587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.758697 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.758725 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.768470 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:17:52.200952257 +0000 UTC Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.861341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.861692 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.861794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.861902 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.861993 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.879458 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:14Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.902078 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:14Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.929691 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:14Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.947098 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:14Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.963794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:14 crc 
kubenswrapper[4585]: I0215 17:06:14.963838 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.963848 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.963867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.963890 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:14Z","lastTransitionTime":"2026-02-15T17:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.964259 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:14Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:14 crc kubenswrapper[4585]: I0215 17:06:14.984522 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:14Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:15 crc kubenswrapper[4585]: I0215 17:06:14.999997 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:14Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:15 crc kubenswrapper[4585]: I0215 17:06:15.010150 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:15Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:15 crc kubenswrapper[4585]: I0215 17:06:15.024371 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:15Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:15 crc kubenswrapper[4585]: I0215 17:06:15.040722 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:15Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.009460 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:17 crc kubenswrapper[4585]: E0215 17:06:17.009773 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.010500 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:17 crc kubenswrapper[4585]: E0215 17:06:17.010639 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.010719 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:17 crc kubenswrapper[4585]: E0215 17:06:17.010807 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.012342 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 17:21:55.053512157 +0000 UTC Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.027514 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.028115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.028138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.028170 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.028197 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.047460 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:17Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.088575 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" probeResult="failure" output="" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.091216 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:17Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.126337 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:17Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.142905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.142953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.142965 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 
17:06:17.142985 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.142997 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.152994 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:17Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.172477 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:17Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.245443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.245496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.245512 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc 
kubenswrapper[4585]: I0215 17:06:17.245539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.245556 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.361053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.361107 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.361118 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.361143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.361159 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.464258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.464317 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.464335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.464365 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.464387 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.567593 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.567704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.567723 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.567757 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.567781 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.670634 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.670692 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.670704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.670726 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.670762 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.775068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.775138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.775162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.775202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.775223 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.879677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.879720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.879729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.879746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.879758 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.984343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.984400 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.984417 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.984445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:17 crc kubenswrapper[4585]: I0215 17:06:17.984465 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:17Z","lastTransitionTime":"2026-02-15T17:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.012695 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 03:28:07.268231617 +0000 UTC Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.038592 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/0.log" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.045030 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa" exitCode=1 Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.045109 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.046440 4585 scope.go:117] "RemoveContainer" containerID="0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.068231 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.091155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.091222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.091247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.091286 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.091336 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.101545 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.121576 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.143337 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.167188 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.182971 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.196548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.196630 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.196655 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc 
kubenswrapper[4585]: I0215 17:06:18.196685 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.196704 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.209149 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:17Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0215 17:06:17.840095 5814 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0215 17:06:17.841906 5814 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0215 17:06:17.841947 5814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0215 17:06:17.841957 5814 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0215 17:06:17.842020 5814 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0215 17:06:17.842034 5814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0215 17:06:17.842067 5814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0215 17:06:17.842063 5814 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0215 17:06:17.842089 5814 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0215 17:06:17.842097 5814 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0215 17:06:17.842107 5814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0215 17:06:17.842453 5814 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.234945 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.256863 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.276644 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.295792 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.301071 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.301137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.301155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc 
kubenswrapper[4585]: I0215 17:06:18.301186 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.301207 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.329380 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.345429 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.357587 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.374621 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:18Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.404587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc 
kubenswrapper[4585]: I0215 17:06:18.404697 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.404717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.404750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.404771 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.507846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.507957 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.507983 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.508019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.508044 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.611276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.611342 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.611355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.611378 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.611394 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.714618 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.714675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.714688 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.714713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.714729 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.818120 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.818175 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.818188 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.818211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.818226 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.841135 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.841194 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.841293 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:18 crc kubenswrapper[4585]: E0215 17:06:18.841472 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:18 crc kubenswrapper[4585]: E0215 17:06:18.841660 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:18 crc kubenswrapper[4585]: E0215 17:06:18.841850 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.920945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.921020 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.921053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.921079 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:18 crc kubenswrapper[4585]: I0215 17:06:18.921092 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:18Z","lastTransitionTime":"2026-02-15T17:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.013914 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 04:54:25.687353062 +0000 UTC Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.024147 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.024187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.024195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.024213 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.024224 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.051259 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/0.log" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.054746 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.055322 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.076253 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5
dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.097908 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.121866 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.126522 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.126589 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.126632 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.126665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.126684 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.142442 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z 
is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.155998 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452
dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.169136 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.185036 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.197464 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.218517 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.229465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.229520 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.229533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.229553 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.229564 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.241573 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.260543 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:17Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0215 17:06:17.840095 5814 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0215 17:06:17.841906 5814 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0215 17:06:17.841947 5814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0215 17:06:17.841957 5814 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0215 17:06:17.842020 5814 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0215 17:06:17.842034 5814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0215 17:06:17.842067 5814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0215 17:06:17.842063 5814 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0215 17:06:17.842089 5814 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0215 17:06:17.842097 5814 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0215 17:06:17.842107 5814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0215 17:06:17.842453 5814 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.276635 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.290555 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.308265 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.323215 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:19Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.332204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.332231 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.332238 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc 
kubenswrapper[4585]: I0215 17:06:19.332254 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.332265 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.435507 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.435574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.435584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.435626 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.435639 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.538876 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.538934 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.538953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.538983 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.539006 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.643387 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.643437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.643451 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.643472 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.643490 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.747147 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.747229 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.747254 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.747291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.747318 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.842241 4585 scope.go:117] "RemoveContainer" containerID="0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.849468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.849540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.849557 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.849589 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.849654 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.955026 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.955106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.955126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.955157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:19 crc kubenswrapper[4585]: I0215 17:06:19.955184 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:19Z","lastTransitionTime":"2026-02-15T17:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.014330 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 06:29:44.509758471 +0000 UTC Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.058117 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.058157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.058166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.058184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.058195 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.061261 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/1.log" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.062145 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/0.log" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.065564 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359" exitCode=1 Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.065627 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.065676 4585 scope.go:117] "RemoveContainer" containerID="0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.066429 4585 scope.go:117] "RemoveContainer" containerID="8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.066652 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.090891 4585 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428
c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea
3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.109208 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.127130 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.146788 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.154309 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj"] Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.154984 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.158770 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.159671 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.160463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.160539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.160628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.160709 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.160765 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.167697 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.186416 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.211033 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.229507 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.248934 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.253067 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75f106e-0b6d-4a73-8b99-66ca018fec60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.253155 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75f106e-0b6d-4a73-8b99-66ca018fec60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.253196 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtg45\" (UniqueName: \"kubernetes.io/projected/b75f106e-0b6d-4a73-8b99-66ca018fec60-kube-api-access-rtg45\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.253266 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75f106e-0b6d-4a73-8b99-66ca018fec60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.263399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.263442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.263462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.263480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.263490 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.265895 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.293219 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:17Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0215 17:06:17.840095 5814 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0215 17:06:17.841906 5814 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0215 17:06:17.841947 5814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0215 17:06:17.841957 5814 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0215 17:06:17.842020 5814 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0215 17:06:17.842034 5814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0215 17:06:17.842067 5814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0215 17:06:17.842063 5814 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0215 17:06:17.842089 5814 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0215 17:06:17.842097 5814 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0215 17:06:17.842107 5814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0215 17:06:17.842453 5814 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service 
openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d
0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.312836 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.330989 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.347494 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.353948 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.354180 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:06:36.354129294 +0000 UTC m=+52.297537426 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.354309 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75f106e-0b6d-4a73-8b99-66ca018fec60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.354407 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75f106e-0b6d-4a73-8b99-66ca018fec60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.354516 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtg45\" (UniqueName: \"kubernetes.io/projected/b75f106e-0b6d-4a73-8b99-66ca018fec60-kube-api-access-rtg45\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.354639 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.354802 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.354917 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:36.354904302 +0000 UTC m=+52.298312524 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.354855 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75f106e-0b6d-4a73-8b99-66ca018fec60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.355192 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b75f106e-0b6d-4a73-8b99-66ca018fec60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.355586 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b75f106e-0b6d-4a73-8b99-66ca018fec60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.360582 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.367972 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b75f106e-0b6d-4a73-8b99-66ca018fec60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.368780 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.368815 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.368825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.368843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.368854 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.370794 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtg45\" (UniqueName: \"kubernetes.io/projected/b75f106e-0b6d-4a73-8b99-66ca018fec60-kube-api-access-rtg45\") pod \"ovnkube-control-plane-749d76644c-klxmj\" (UID: \"b75f106e-0b6d-4a73-8b99-66ca018fec60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.378542 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.390912 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.405103 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.419106 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721
789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.432364 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.448314 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.455592 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.455784 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.455818 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.455833 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.455892 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:36.455872592 +0000 UTC m=+52.399280724 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.455969 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.456043 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.456172 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.456244 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.456263 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.456324 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.456366 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:36.456340034 +0000 UTC m=+52.399748176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.456435 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:36.456424066 +0000 UTC m=+52.399832218 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.462362 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.471263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.471298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.471310 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.471328 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.471340 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.474639 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.492408 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.513922 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.529884 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c664
7a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.540316 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.564662 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.620880 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.620924 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.620934 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.620953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.620964 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.627884 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.643670 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.660459 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:17Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0215 17:06:17.840095 5814 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0215 17:06:17.841906 5814 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0215 17:06:17.841947 5814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0215 17:06:17.841957 5814 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0215 17:06:17.842020 5814 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0215 17:06:17.842034 5814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0215 17:06:17.842067 5814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0215 17:06:17.842063 5814 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0215 17:06:17.842089 5814 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0215 17:06:17.842097 5814 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0215 17:06:17.842107 5814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0215 17:06:17.842453 5814 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed 
attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"
/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc 
kubenswrapper[4585]: I0215 17:06:20.671047 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.723709 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.723743 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.723761 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.723778 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.723788 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.826039 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.826281 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.826344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.826458 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.826517 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.841425 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.841662 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.842220 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.842336 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.842536 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.842666 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.929729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.929962 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.930023 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.930134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.930237 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:20Z","lastTransitionTime":"2026-02-15T17:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.950837 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gclkf"] Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.951366 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:20 crc kubenswrapper[4585]: E0215 17:06:20.951432 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.959931 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.959974 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4lt\" (UniqueName: \"kubernetes.io/projected/ee2e2535-c7ad-42e7-930b-8e0471dfca11-kube-api-access-nl4lt\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.968166 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:20 crc kubenswrapper[4585]: I0215 17:06:20.991108 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ac3244f4ab75f243162d25ee98bbe0a28e22fb8f9a7aaf9c3e244273834adfa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:17Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0215 17:06:17.840095 5814 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0215 17:06:17.841906 5814 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0215 17:06:17.841947 5814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0215 17:06:17.841957 5814 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0215 17:06:17.842020 5814 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0215 17:06:17.842034 5814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0215 17:06:17.842067 5814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0215 17:06:17.842063 5814 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0215 17:06:17.842089 5814 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0215 17:06:17.842097 5814 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0215 17:06:17.842107 5814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0215 17:06:17.842453 5814 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed 
attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"
/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:20Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc 
kubenswrapper[4585]: I0215 17:06:21.005754 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff
1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.014503 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 16:40:53.122981194 +0000 UTC Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.018490 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.032187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.032218 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc 
kubenswrapper[4585]: I0215 17:06:21.032227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.032245 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.032257 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.033723 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.047414 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.059462 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc 
kubenswrapper[4585]: I0215 17:06:21.060703 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.060733 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4lt\" (UniqueName: \"kubernetes.io/projected/ee2e2535-c7ad-42e7-930b-8e0471dfca11-kube-api-access-nl4lt\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.061037 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.061086 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:21.561070184 +0000 UTC m=+37.504478316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.069692 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.071155 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.071782 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.074402 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" event={"ID":"b75f106e-0b6d-4a73-8b99-66ca018fec60","Type":"ContainerStarted","Data":"ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.074546 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" event={"ID":"b75f106e-0b6d-4a73-8b99-66ca018fec60","Type":"ContainerStarted","Data":"47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.074633 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" 
event={"ID":"b75f106e-0b6d-4a73-8b99-66ca018fec60","Type":"ContainerStarted","Data":"073e58677644b9f4866bf37059c4e87832f56c76676af4e6275196c8be4ebfab"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.080735 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4lt\" (UniqueName: \"kubernetes.io/projected/ee2e2535-c7ad-42e7-930b-8e0471dfca11-kube-api-access-nl4lt\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.082235 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/1.log" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.085322 4585 scope.go:117] "RemoveContainer" containerID="8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.085486 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.085985 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.099084 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.110904 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.124183 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.126938 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.126978 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.126988 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.127010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.127020 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.136149 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.139489 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"sys
temUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.142165 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.142259 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.142330 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.142450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.142522 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.147635 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6
831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.153663 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.160960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.161006 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.161019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.161041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.161055 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.165766 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.174922 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.177979 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.178080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.178136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.178198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.178326 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.179991 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.189977 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.193469 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.196798 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.196835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.196863 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.196886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.196900 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.208841 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.210106 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.210221 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.211883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.211933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.211943 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.211960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.211976 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.224618 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.238485 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.252312 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.263305 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.277430 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.293710 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.311399 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.314674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.314708 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.314720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.314740 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.314754 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.328581 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.342263 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.358227 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.370917 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.391556 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.404505 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.416492 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.417226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.417355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.417653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.417823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.417978 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.433861 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z 
is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.474746 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.500020 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:21Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:21 crc 
kubenswrapper[4585]: I0215 17:06:21.520836 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.520911 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.520927 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.520955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.520971 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.566903 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.567183 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:21 crc kubenswrapper[4585]: E0215 17:06:21.567350 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:22.567313766 +0000 UTC m=+38.510721968 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.624306 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.624377 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.624396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.624430 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.624451 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.728201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.728283 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.728302 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.728333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.728353 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.830966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.831211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.831273 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.831341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.831407 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.934315 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.934412 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.934431 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.934462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:21 crc kubenswrapper[4585]: I0215 17:06:21.934485 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:21Z","lastTransitionTime":"2026-02-15T17:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.015132 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:36:49.473365962 +0000 UTC Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.045030 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.045105 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.045124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.045159 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.045199 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.149714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.149776 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.149793 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.149821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.149839 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.256335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.256717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.256860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.256994 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.257211 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.360629 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.360705 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.360725 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.360754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.360775 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.464161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.464674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.464914 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.465134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.465287 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.568730 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.568797 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.568816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.568845 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.568868 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.581133 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:22 crc kubenswrapper[4585]: E0215 17:06:22.581394 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:22 crc kubenswrapper[4585]: E0215 17:06:22.581520 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:24.581489632 +0000 UTC m=+40.524897804 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.672136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.672188 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.672198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.672215 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.672226 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.776106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.776176 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.776188 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.776209 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.776223 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.841196 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:22 crc kubenswrapper[4585]: E0215 17:06:22.841686 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.841993 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.842054 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.842021 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:22 crc kubenswrapper[4585]: E0215 17:06:22.842262 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:22 crc kubenswrapper[4585]: E0215 17:06:22.842535 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:22 crc kubenswrapper[4585]: E0215 17:06:22.842703 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.879447 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.879498 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.879514 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.879538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.879559 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.982842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.982911 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.982930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.982960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:22 crc kubenswrapper[4585]: I0215 17:06:22.982978 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:22Z","lastTransitionTime":"2026-02-15T17:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.015734 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 05:08:42.405047761 +0000 UTC Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.086750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.086826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.086843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.086873 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.086895 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.189893 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.189961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.189978 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.190009 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.190033 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.293521 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.293577 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.293639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.293674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.293698 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.396976 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.397036 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.397054 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.397084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.397105 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.500896 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.500956 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.500971 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.500998 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.501017 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.604689 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.604750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.604767 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.604793 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.604808 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.708999 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.709076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.709094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.709121 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.709144 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.812842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.812901 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.812918 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.812944 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.812965 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.916765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.917124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.917299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.917450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:23 crc kubenswrapper[4585]: I0215 17:06:23.917592 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:23Z","lastTransitionTime":"2026-02-15T17:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.016689 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 04:04:03.151264306 +0000 UTC Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.021577 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.021653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.021673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.021702 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.021719 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.124714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.124962 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.124984 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.125016 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.125039 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.230724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.230801 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.230818 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.230846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.230865 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.334340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.334393 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.334409 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.334436 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.334454 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.437746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.437831 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.437857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.437892 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.437913 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.542196 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.542276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.542299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.542334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.542359 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.605543 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:24 crc kubenswrapper[4585]: E0215 17:06:24.605808 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:24 crc kubenswrapper[4585]: E0215 17:06:24.605891 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:28.605863667 +0000 UTC m=+44.549271839 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.644808 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.644878 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.644901 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.644940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.644967 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.748208 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.748261 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.748282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.748314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.748337 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.841500 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:24 crc kubenswrapper[4585]: E0215 17:06:24.842106 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.841773 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.841717 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:24 crc kubenswrapper[4585]: E0215 17:06:24.842889 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:24 crc kubenswrapper[4585]: E0215 17:06:24.842699 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.842571 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:24 crc kubenswrapper[4585]: E0215 17:06:24.843562 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.852165 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.852225 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.852246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.852277 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.852296 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.856397 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.888425 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 
17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.904526 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.928022 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.948089 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.957295 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.957351 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.957372 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.957399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.957460 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:24Z","lastTransitionTime":"2026-02-15T17:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.963746 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.981194 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:24 crc kubenswrapper[4585]: I0215 17:06:24.995147 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d
523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:24Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.013338 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc 
kubenswrapper[4585]: I0215 17:06:25.016848 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 16:31:08.022668477 +0000 UTC Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.038268 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5de
ce66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.053366 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.059884 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.059967 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.059979 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.059996 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.060031 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.063251 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.072403 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947e
c6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.087149 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1c
c5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.100431 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.121005 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.134809 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:25Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.163303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.163356 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.163367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.163388 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.163401 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.266662 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.266701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.266710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.266726 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.266738 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.369994 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.370053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.370070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.370095 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.370113 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.472857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.472919 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.472941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.472969 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.472989 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.576378 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.576456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.576477 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.576509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.576529 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.680261 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.680308 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.680327 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.680350 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.680370 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.783303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.783403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.783429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.783463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.783488 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.886791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.886835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.886847 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.886866 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.886879 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.990486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.990555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.990572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.990629 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:25 crc kubenswrapper[4585]: I0215 17:06:25.990649 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:25Z","lastTransitionTime":"2026-02-15T17:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.017907 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 20:54:20.023701426 +0000 UTC Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.094707 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.094813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.094832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.094862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.094880 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.198591 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.198753 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.198775 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.198802 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.198823 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.302910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.302976 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.302993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.303021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.303040 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.406987 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.407058 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.407076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.407110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.407134 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.510181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.510246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.510263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.510290 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.510308 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.613769 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.613818 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.613834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.613858 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.613875 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.718034 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.718109 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.718136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.718168 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.718193 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.822396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.822469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.822487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.822516 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.822537 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.840996 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.841138 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.841029 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:26 crc kubenswrapper[4585]: E0215 17:06:26.841229 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:26 crc kubenswrapper[4585]: E0215 17:06:26.841407 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.841526 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:26 crc kubenswrapper[4585]: E0215 17:06:26.841555 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:26 crc kubenswrapper[4585]: E0215 17:06:26.841721 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.926261 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.926441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.926472 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.926557 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:26 crc kubenswrapper[4585]: I0215 17:06:26.926658 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:26Z","lastTransitionTime":"2026-02-15T17:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.018948 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 20:18:50.724857649 +0000 UTC Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.030344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.030426 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.030444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.030476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.030497 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.133522 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.133593 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.133647 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.133683 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.133702 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.236841 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.236912 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.236936 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.236971 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.236994 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.340922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.340982 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.340999 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.341027 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.341050 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.444327 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.444396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.444415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.444443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.444467 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.548557 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.548652 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.548668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.548695 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.548714 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.652150 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.652207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.652223 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.652248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.652267 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.755763 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.755876 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.755897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.755945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.755968 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.862808 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.862929 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.862956 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.862988 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.863015 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.965895 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.965983 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.966005 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.966034 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:27 crc kubenswrapper[4585]: I0215 17:06:27.966055 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:27Z","lastTransitionTime":"2026-02-15T17:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.019971 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 18:37:57.561166308 +0000 UTC Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.070166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.070220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.070240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.070267 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.070289 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.174059 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.174138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.174159 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.174194 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.174216 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.278558 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.278666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.278692 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.278731 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.278750 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.381879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.381966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.381988 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.382020 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.382042 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.485843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.485926 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.485945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.485977 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.485997 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.589722 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.590146 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.590571 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.591953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.592333 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.683439 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:28 crc kubenswrapper[4585]: E0215 17:06:28.684065 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:28 crc kubenswrapper[4585]: E0215 17:06:28.684288 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:36.684253559 +0000 UTC m=+52.627661731 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.696696 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.697002 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.697100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.697230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.697354 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.801381 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.801661 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.801875 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.801964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.802038 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.840812 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:28 crc kubenswrapper[4585]: E0215 17:06:28.841323 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.841148 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:28 crc kubenswrapper[4585]: E0215 17:06:28.842119 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.841473 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:28 crc kubenswrapper[4585]: E0215 17:06:28.842393 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.841097 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:28 crc kubenswrapper[4585]: E0215 17:06:28.842669 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.905747 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.906041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.906125 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.906214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:28 crc kubenswrapper[4585]: I0215 17:06:28.906294 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:28Z","lastTransitionTime":"2026-02-15T17:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.009550 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.009647 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.009695 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.009724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.009745 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.021172 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 22:42:19.350312359 +0000 UTC Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.112791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.112851 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.112869 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.112896 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.112920 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.216438 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.216482 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.216505 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.216533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.216551 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.320562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.320693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.320718 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.320762 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.320790 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.424011 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.424077 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.424095 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.424123 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.424145 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.527179 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.527240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.527260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.527286 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.527304 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.631022 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.631278 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.631373 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.631446 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.631510 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.735478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.735548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.735566 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.735595 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.735642 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.839221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.839301 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.839321 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.839358 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.839379 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.942858 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.942945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.942970 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.943001 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:29 crc kubenswrapper[4585]: I0215 17:06:29.943029 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:29Z","lastTransitionTime":"2026-02-15T17:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.022078 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:26:58.618155184 +0000 UTC Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.046081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.046160 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.046181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.046214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.046236 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.150179 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.150546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.150713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.150897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.151041 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.254870 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.254944 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.254968 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.255006 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.255035 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.358067 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.358161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.358189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.358221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.358239 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.461849 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.461932 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.461950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.461981 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.462001 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.566004 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.566088 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.566109 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.566138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.566158 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.669532 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.669921 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.670100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.670262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.670419 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.773584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.773875 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.773945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.774040 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.774128 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.842473 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.842759 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.842850 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.843026 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:30 crc kubenswrapper[4585]: E0215 17:06:30.843468 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:30 crc kubenswrapper[4585]: E0215 17:06:30.843815 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:30 crc kubenswrapper[4585]: E0215 17:06:30.843967 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:30 crc kubenswrapper[4585]: E0215 17:06:30.843721 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.877492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.877558 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.877626 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.877666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.877691 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.982586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.983052 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.983279 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.983553 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:30 crc kubenswrapper[4585]: I0215 17:06:30.983724 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:30Z","lastTransitionTime":"2026-02-15T17:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.023576 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 23:39:34.342781297 +0000 UTC Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.087493 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.087554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.087569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.087592 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.087627 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.190594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.191116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.191796 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.191863 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.191886 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.295041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.295466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.295671 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.295872 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.296060 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.399864 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.399915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.399936 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.399963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.399983 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.471804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.471887 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.471905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.471933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.471955 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: E0215 17:06:31.492968 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.498703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.498765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.498783 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.498810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.498829 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: E0215 17:06:31.522551 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.529067 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.529150 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.529169 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.529200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.529221 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: E0215 17:06:31.550158 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.555379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.555446 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.555463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.555492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.555513 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: E0215 17:06:31.575736 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.581492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.581652 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.581906 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.582163 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.582383 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: E0215 17:06:31.598244 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: E0215 17:06:31.598674 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.600698 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.600745 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.600762 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.600790 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.600808 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.704247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.704530 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.704689 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.704887 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.705023 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.808842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.809102 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.809207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.809351 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.809469 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.875130 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.894349 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc 
kubenswrapper[4585]: I0215 17:06:31.912167 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.912229 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.912250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.912276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.912295 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:31Z","lastTransitionTime":"2026-02-15T17:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.925151 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.942461 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.960347 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.980331 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:31 crc kubenswrapper[4585]: I0215 17:06:31.993409 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d
523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:31Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.014848 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.014915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.014939 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.014966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.014984 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.020647 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.024159 4585 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 05:35:31.893496833 +0000 UTC Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.034167 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.052817 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.066962 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.088877 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.107165 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.118413 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.118467 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.118494 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc 
kubenswrapper[4585]: I0215 17:06:32.118527 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.118549 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.137005 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.157312 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.176490 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.193518 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.213671 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:32Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.222369 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.222449 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.222474 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc 
kubenswrapper[4585]: I0215 17:06:32.222999 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.223294 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.326750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.326806 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.326825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.326851 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.326866 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.429695 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.429749 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.429762 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.429785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.429798 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.532007 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.532057 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.532067 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.532085 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.532100 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.635533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.635632 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.635652 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.635684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.635706 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.739322 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.739402 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.739421 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.739450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.739469 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.841068 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.841139 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.841206 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.841267 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:32 crc kubenswrapper[4585]: E0215 17:06:32.841299 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:32 crc kubenswrapper[4585]: E0215 17:06:32.841506 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:32 crc kubenswrapper[4585]: E0215 17:06:32.841777 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:32 crc kubenswrapper[4585]: E0215 17:06:32.841935 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.843586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.843672 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.843690 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.843720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.843740 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.946961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.947037 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.947059 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.947089 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:32 crc kubenswrapper[4585]: I0215 17:06:32.947106 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:32Z","lastTransitionTime":"2026-02-15T17:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.024992 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:53:17.39243285 +0000 UTC Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.050472 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.050575 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.050594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.050657 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.050680 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.153822 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.153887 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.153913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.154001 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.154022 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.256987 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.257092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.257108 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.257131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.257145 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.409000 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.409132 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.409151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.409187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.409208 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.512540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.512651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.512674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.512705 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.512726 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.616496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.616579 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.616648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.616684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.616727 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.720402 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.720478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.720496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.720528 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.720573 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.824403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.824486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.824510 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.824546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.824564 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.928645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.928700 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.928717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.928742 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:33 crc kubenswrapper[4585]: I0215 17:06:33.928762 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:33Z","lastTransitionTime":"2026-02-15T17:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.025889 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 03:29:41.229742343 +0000 UTC Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.032701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.032765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.032784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.032812 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.032831 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.139221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.139283 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.139302 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.139333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.139353 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.243658 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.243719 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.243736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.243762 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.243781 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.347922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.347995 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.348013 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.348044 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.348063 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.451457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.451535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.451555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.451591 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.451637 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.554872 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.554941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.554958 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.554985 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.555008 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.658905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.659172 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.659207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.659240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.659263 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.762817 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.762889 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.762907 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.762930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.762947 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.840848 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.840962 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.840982 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:34 crc kubenswrapper[4585]: E0215 17:06:34.841376 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.841502 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:34 crc kubenswrapper[4585]: E0215 17:06:34.842249 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:34 crc kubenswrapper[4585]: E0215 17:06:34.842471 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:34 crc kubenswrapper[4585]: E0215 17:06:34.842635 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.843803 4585 scope.go:117] "RemoveContainer" containerID="8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.866244 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.866302 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.866320 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.866349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.866370 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.867987 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:34Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.902487 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 
17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:34Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.928852 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:34Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.953218 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:34Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.969192 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.969248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.969264 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 
17:06:34.969288 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.969304 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:34Z","lastTransitionTime":"2026-02-15T17:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.975497 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:34Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:34 crc kubenswrapper[4585]: I0215 17:06:34.991905 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:34Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.009237 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc 
kubenswrapper[4585]: I0215 17:06:35.026816 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 03:48:08.49621459 +0000 UTC Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.045498 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dec
e66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.061516 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.072322 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.072360 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.072377 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.072404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.072422 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.075949 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.097978 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d8324688434
3bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.111808 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af
3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.125719 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8805
1c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.140895 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.152155 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.159349 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/1.log" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.161856 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3"} Feb 15 17:06:35 crc 
kubenswrapper[4585]: I0215 17:06:35.162469 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.162899 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.175072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.175116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.175129 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.175147 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.175160 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.178008 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.191252 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.214028 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 
17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.230978 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.242356 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.255821 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.268014 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.278138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.278188 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.278202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 
17:06:35.278224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.278240 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.288465 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.301767 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.318106 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc 
kubenswrapper[4585]: I0215 17:06:35.348120 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.360457 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.372799 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.381112 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.381166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.381181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.381205 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.381219 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.384215 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.398563 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.411081 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.422175 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.434257 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:35Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.484211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.484258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.484267 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.484286 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.484298 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.632857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.632930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.632948 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.632979 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.632997 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.735657 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.735715 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.735734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.735760 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.735778 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.839820 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.839860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.839873 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.839892 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.839906 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.943224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.943282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.943301 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.943327 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:35 crc kubenswrapper[4585]: I0215 17:06:35.943344 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:35Z","lastTransitionTime":"2026-02-15T17:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.027529 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 13:26:30.219846104 +0000 UTC Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.046322 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.046383 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.046401 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.046429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.046447 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.149666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.149762 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.149786 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.149820 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.149843 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.168488 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/2.log" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.169643 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/1.log" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.173805 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3" exitCode=1 Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.173851 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.173903 4585 scope.go:117] "RemoveContainer" containerID="8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.174826 4585 scope.go:117] "RemoveContainer" containerID="a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.175023 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.200337 4585 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.217592 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.253811 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.253860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.253877 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.253900 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.253915 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.263955 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.288380 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metr
ics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.308655 4585 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc 
kubenswrapper[4585]: I0215 17:06:36.334537 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.350907 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.356559 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.356617 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.356630 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.356653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.356668 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.364934 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.374775 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.381406 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.381612 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:07:08.381570086 +0000 UTC m=+84.324978218 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.381657 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.381771 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.381832 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:07:08.381816932 +0000 UTC m=+84.325225064 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.389292 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.400870 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.412319 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.429994 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a700d477e06547a920e23a052041e5557890aaa4be592b0dce4124af4727359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:19Z\\\",\\\"message\\\":\\\": *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0215 17:06:19.179582 5957 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-wvfh6 in node crc\\\\nI0215 17:06:19.180304 5957 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nI0215 17:06:19.180307 5957 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-wvfh6 after 0 failed attempt(s)\\\\nI0215 17:06:19.180313 5957 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-wvfh6\\\\nI0215 17:06:19.180314 5957 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI0215 17:06:19.180210 5957 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for 
openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb510
9d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.442400 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.454321 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.458862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.458917 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.458927 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.458948 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.458960 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.465732 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.477159 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:36Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.482889 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.483000 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.483055 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483236 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483293 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483310 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483395 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:07:08.483368797 +0000 UTC m=+84.426776929 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483249 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483456 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483486 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483247 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483632 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:07:08.483566072 +0000 UTC m=+84.426974244 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.483681 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:07:08.483664244 +0000 UTC m=+84.427072416 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.562347 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.562397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.562406 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.562426 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.562439 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.665842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.665897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.665914 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.665942 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.665991 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.685487 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.685719 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.685800 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:06:52.685779957 +0000 UTC m=+68.629188089 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.769058 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.769097 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.769107 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.769123 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.769134 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.841876 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.841984 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.842150 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.842203 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.842404 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.841881 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.842629 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:36 crc kubenswrapper[4585]: E0215 17:06:36.842761 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.871798 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.871867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.871886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.871920 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.871949 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.975828 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.975897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.975920 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.975956 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:36 crc kubenswrapper[4585]: I0215 17:06:36.975980 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:36Z","lastTransitionTime":"2026-02-15T17:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.028073 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 19:38:25.293575303 +0000 UTC Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.078777 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.078844 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.078865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.078897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.078918 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.181581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.181693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.181712 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.181742 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.181761 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.182064 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/2.log" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.188450 4585 scope.go:117] "RemoveContainer" containerID="a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3" Feb 15 17:06:37 crc kubenswrapper[4585]: E0215 17:06:37.188859 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.210755 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.231939 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.253439 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.276449 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.285353 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.285543 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.285569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.285642 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.285663 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.291237 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.299376 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.314317 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.322078 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328
992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.346172 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.365946 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d
523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.383782 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc 
kubenswrapper[4585]: I0215 17:06:37.389135 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.389180 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.389198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.389224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.389458 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.416857 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.435919 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.454960 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.469052 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.493041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.493488 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.493569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.493693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.493770 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.493843 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.514445 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.534848 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.563578 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPF
amilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.580733 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.596377 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.596572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.596695 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc 
kubenswrapper[4585]: I0215 17:06:37.596781 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.596880 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.604138 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 
17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.621294 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.639524 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.659318 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.676883 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.693586 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.700555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.700661 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.700687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc 
kubenswrapper[4585]: I0215 17:06:37.700724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.700751 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.729674 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.750076 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.769148 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.792396 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.803482 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc 
kubenswrapper[4585]: I0215 17:06:37.803543 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.803562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.803813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.803835 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.812511 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411de
dc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.827689 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc 
kubenswrapper[4585]: I0215 17:06:37.851153 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.871774 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.890986 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.906206 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.907215 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.907348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.907441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.907541 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.907660 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:37Z","lastTransitionTime":"2026-02-15T17:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:37 crc kubenswrapper[4585]: I0215 17:06:37.930984 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:37Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.010810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.010894 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.010913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.010946 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.010970 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.028224 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 23:24:02.970731915 +0000 UTC Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.114849 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.114933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.114964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.114997 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.115018 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.217653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.217750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.217775 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.217816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.217844 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.321372 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.321431 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.321442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.321457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.321467 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.425174 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.425654 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.425762 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.425860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.425942 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.533094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.533203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.533221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.533247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.533265 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.636945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.637019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.637035 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.637064 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.637083 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.740766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.740826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.740843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.740869 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.740888 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.841962 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.842048 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.841989 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.841982 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:06:38 crc kubenswrapper[4585]: E0215 17:06:38.842196 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:06:38 crc kubenswrapper[4585]: E0215 17:06:38.842319 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:06:38 crc kubenswrapper[4585]: E0215 17:06:38.842435 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:06:38 crc kubenswrapper[4585]: E0215 17:06:38.843740 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.844306 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.844344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.844360 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.844383 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.844403 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.947738 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.947800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.947820 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.947844 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:38 crc kubenswrapper[4585]: I0215 17:06:38.947862 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:38Z","lastTransitionTime":"2026-02-15T17:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.028943 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 10:06:23.691679935 +0000 UTC
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.051193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.051258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.051276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.051304 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.051322 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.154352 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.154409 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.154428 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.154455 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.154477 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.258443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.258515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.258533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.258659 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.258704 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.361859 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.361935 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.361957 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.361988 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.362010 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.465172 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.465255 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.465295 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.465335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.465359 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.569068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.569139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.569157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.569187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.569210 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.672234 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.672713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.672859 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.673008 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.673130 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.776999 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.777084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.777097 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.777121 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.777135 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.880379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.880468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.880495 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.880532 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.880557 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.984433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.984827 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.985000 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.985156 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:39 crc kubenswrapper[4585]: I0215 17:06:39.985290 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:39Z","lastTransitionTime":"2026-02-15T17:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.030009 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 19:12:50.098026226 +0000 UTC
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.088533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.088665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.088684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.088714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.088732 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.192465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.192796 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.192922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.193028 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.193127 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.296973 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.297269 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.297330 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.297424 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.297490 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.401266 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.401340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.401363 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.401396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.401423 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.505043 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.505113 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.505134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.505160 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.505178 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.607655 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.607706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.607719 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.607747 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.607762 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.710463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.710534 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.710551 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.710581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.710629 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.813472 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.813572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.813584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.813615 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.813627 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.840800 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:06:40 crc kubenswrapper[4585]: E0215 17:06:40.841208 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.841574 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:06:40 crc kubenswrapper[4585]: E0215 17:06:40.841755 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.841974 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:06:40 crc kubenswrapper[4585]: E0215 17:06:40.842074 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.842425 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:06:40 crc kubenswrapper[4585]: E0215 17:06:40.842654 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.923660 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.923750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.923772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.923804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:40 crc kubenswrapper[4585]: I0215 17:06:40.923825 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:40Z","lastTransitionTime":"2026-02-15T17:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.028536 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.029143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.029376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.029538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.029707 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.030576 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 19:53:23.473498788 +0000 UTC
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.132905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.132953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.132964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.132984 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.132996 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.237100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.237545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.237746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.237940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.238118 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.341406 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.341501 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.341523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.341561 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.341586 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.444741 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.445109 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.445239 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.445390 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.445722 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.549360 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.549437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.549455 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.549484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.549507 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.653002 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.653064 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.653077 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.653105 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.653120 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.755946 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.756007 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.756025 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.756053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.756074 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.773769 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.773834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.773852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.773878 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.773901 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: E0215 17:06:41.795714 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:41Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.801751 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.801826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.801850 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.801884 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.801904 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: E0215 17:06:41.822873 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:41Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.828416 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.828492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.828516 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.828549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.828572 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: E0215 17:06:41.850999 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:41Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.856481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.856553 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.856571 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.856631 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.856654 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: E0215 17:06:41.876301 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:41Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.881435 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.881507 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.881537 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.881571 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.881636 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:41 crc kubenswrapper[4585]: E0215 17:06:41.901796 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:41Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:41 crc kubenswrapper[4585]: E0215 17:06:41.902030 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.905090 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.905154 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.905167 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.905198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:41 crc kubenswrapper[4585]: I0215 17:06:41.905213 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:41Z","lastTransitionTime":"2026-02-15T17:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.007775 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.007839 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.007848 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.007866 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.007878 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.031287 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 13:59:33.04446489 +0000 UTC Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.112085 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.112174 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.112196 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.112253 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.112273 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.215187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.215269 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.215291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.215325 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.215355 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.318766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.318835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.318852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.318879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.318897 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.422756 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.422821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.422839 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.422868 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.422899 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.526867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.526942 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.526960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.527064 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.527089 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.630955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.631032 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.631051 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.631080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.631099 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.734711 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.734785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.734801 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.734832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.734851 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.838589 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.841147 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.841215 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.841259 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.841383 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.844836 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.844850 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:42 crc kubenswrapper[4585]: E0215 17:06:42.845262 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.845529 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.845650 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:42 crc kubenswrapper[4585]: E0215 17:06:42.845870 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:42 crc kubenswrapper[4585]: E0215 17:06:42.846416 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:42 crc kubenswrapper[4585]: E0215 17:06:42.847233 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.945727 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.945827 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.945846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.945873 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:42 crc kubenswrapper[4585]: I0215 17:06:42.945892 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:42Z","lastTransitionTime":"2026-02-15T17:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.032160 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 17:11:04.125079095 +0000 UTC Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.055206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.055275 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.055294 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.055324 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.055344 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.158884 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.158950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.158971 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.159000 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.159019 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.262985 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.263058 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.263076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.263107 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.263127 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.367206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.367269 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.367289 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.367319 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.367344 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.470845 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.470896 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.470914 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.470991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.471014 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.574422 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.574493 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.574510 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.574539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.574561 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.677891 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.678011 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.678034 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.678061 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.678080 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.781842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.781928 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.781953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.782064 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.782147 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.886365 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.886450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.886476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.886510 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.886533 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.990571 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.990673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.990693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.990723 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:43 crc kubenswrapper[4585]: I0215 17:06:43.990740 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:43Z","lastTransitionTime":"2026-02-15T17:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.033081 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 19:59:11.265963807 +0000 UTC Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.094897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.094981 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.094998 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.095029 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.095047 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.198749 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.198821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.198840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.198867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.198884 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.302529 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.302595 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.302649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.302684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.302708 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.406367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.406449 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.406469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.406497 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.406516 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.509734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.509813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.509843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.509878 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.509898 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.613221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.613279 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.613294 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.613321 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.613371 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.722387 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.722455 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.722466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.722483 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.722493 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.828097 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.828855 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.828898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.828941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.828984 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.840738 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.841047 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.841115 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.841193 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:44 crc kubenswrapper[4585]: E0215 17:06:44.841326 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:44 crc kubenswrapper[4585]: E0215 17:06:44.841391 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:44 crc kubenswrapper[4585]: E0215 17:06:44.841251 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:44 crc kubenswrapper[4585]: E0215 17:06:44.841666 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.863456 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.879413 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.892226 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:44 crc 
kubenswrapper[4585]: I0215 17:06:44.928615 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.934395 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.934435 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.934451 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.934475 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.934492 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:44Z","lastTransitionTime":"2026-02-15T17:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.947056 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.962122 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.977080 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:44 crc kubenswrapper[4585]: I0215 17:06:44.995689 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:44Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.012561 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.030590 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.033651 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 21:15:38.275949955 +0000 UTC Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.037736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.037776 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.037787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.037807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.037820 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.056093 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.074364 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.103837 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPF
amilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.122257 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.140992 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.141048 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.141069 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.141094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.141113 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.143989 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f
070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.167913 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\
\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.189434 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.208999 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:45Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.244189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.244251 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.244263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 
17:06:45.244282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.244295 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.347820 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.347880 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.347894 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.347919 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.347936 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.450779 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.450825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.450835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.450852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.450861 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.554431 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.554495 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.554513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.554542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.554563 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.658467 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.658542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.658562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.658587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.658703 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.764289 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.764354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.764369 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.764391 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.764406 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.868478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.868553 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.868576 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.868644 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.868666 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.971404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.971454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.971470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.971491 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:45 crc kubenswrapper[4585]: I0215 17:06:45.971506 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:45Z","lastTransitionTime":"2026-02-15T17:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.034296 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 14:57:08.035893772 +0000 UTC Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.075425 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.075515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.075540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.075651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.075719 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.183723 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.183801 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.183827 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.183857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.183874 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.288197 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.288629 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.288847 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.289031 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.289173 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.392539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.392644 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.392673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.392704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.392725 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.496089 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.496137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.496154 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.496179 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.496195 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.601584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.601676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.601694 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.601717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.601734 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.704961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.705004 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.705022 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.705044 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.705062 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.807537 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.807637 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.807662 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.807687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.807706 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.842754 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:46 crc kubenswrapper[4585]: E0215 17:06:46.842897 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.842980 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.843050 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:46 crc kubenswrapper[4585]: E0215 17:06:46.843223 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.843511 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:46 crc kubenswrapper[4585]: E0215 17:06:46.843678 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:46 crc kubenswrapper[4585]: E0215 17:06:46.843839 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.910478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.910508 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.910516 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.910528 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:46 crc kubenswrapper[4585]: I0215 17:06:46.910538 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:46Z","lastTransitionTime":"2026-02-15T17:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.013344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.013384 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.013392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.013405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.013418 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.035411 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 00:34:00.316046604 +0000 UTC Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.116441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.116493 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.116511 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.116535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.116550 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.221023 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.221090 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.221115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.221142 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.221163 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.324651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.324708 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.324724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.324747 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.324768 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.431065 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.431117 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.431134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.431162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.431179 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.534171 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.534219 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.534238 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.534266 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.534288 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.654121 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.654191 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.654208 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.654234 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.654259 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.764997 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.766048 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.766106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.766136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.766241 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.869748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.869805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.869821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.869844 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.869861 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.973128 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.973182 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.973199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.973224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:47 crc kubenswrapper[4585]: I0215 17:06:47.973240 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:47Z","lastTransitionTime":"2026-02-15T17:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.035942 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 20:58:26.62364868 +0000 UTC Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.076923 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.076970 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.076991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.077013 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.077031 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.180633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.180673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.180683 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.180697 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.180706 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.283239 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.283292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.283314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.283345 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.283366 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.386682 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.386718 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.386729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.386747 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.386762 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.490024 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.490070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.490081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.490100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.490112 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.592442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.592519 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.592545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.592574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.592646 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.695562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.695821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.695838 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.695857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.695872 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.797672 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.797731 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.797754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.797781 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.797807 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.841151 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:48 crc kubenswrapper[4585]: E0215 17:06:48.841380 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.842568 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:48 crc kubenswrapper[4585]: E0215 17:06:48.842720 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.842990 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:48 crc kubenswrapper[4585]: E0215 17:06:48.843091 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.843931 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:48 crc kubenswrapper[4585]: E0215 17:06:48.844032 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.844455 4585 scope.go:117] "RemoveContainer" containerID="a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3" Feb 15 17:06:48 crc kubenswrapper[4585]: E0215 17:06:48.844747 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.900779 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.900856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.900880 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.900909 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:48 crc kubenswrapper[4585]: I0215 17:06:48.900931 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:48Z","lastTransitionTime":"2026-02-15T17:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.003585 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.003645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.003656 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.003672 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.003684 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.036265 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 05:03:34.878426805 +0000 UTC Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.106295 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.106363 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.106405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.106435 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.106457 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.209463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.209503 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.209517 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.209534 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.209545 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.312452 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.312515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.312525 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.312541 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.312554 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.414471 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.414537 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.414555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.414583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.414633 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.517730 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.517800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.517824 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.517854 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.517876 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.621577 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.621648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.621661 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.621684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.621703 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.724960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.725008 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.725019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.725037 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.725048 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.828116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.828199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.828212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.828227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.828235 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.931298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.931376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.931395 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.931422 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:49 crc kubenswrapper[4585]: I0215 17:06:49.931442 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:49Z","lastTransitionTime":"2026-02-15T17:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.038252 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 00:54:15.552172849 +0000 UTC Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.043087 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.043727 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.043832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.043911 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.043937 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.148168 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.148224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.148233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.148253 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.148264 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.250271 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.250327 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.250343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.250367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.250384 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.354222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.354289 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.354306 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.354333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.354351 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.456717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.456765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.456777 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.456794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.456807 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.559448 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.559494 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.559507 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.559525 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.559538 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.662065 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.662100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.662110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.662122 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.662132 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.765466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.765523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.765539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.765564 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.765581 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.840854 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.840923 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.840946 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.841033 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:50 crc kubenswrapper[4585]: E0215 17:06:50.841117 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:50 crc kubenswrapper[4585]: E0215 17:06:50.841208 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:50 crc kubenswrapper[4585]: E0215 17:06:50.841351 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:50 crc kubenswrapper[4585]: E0215 17:06:50.841454 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.868074 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.868113 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.868124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.868140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.868152 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.970762 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.970800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.970810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.970825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:50 crc kubenswrapper[4585]: I0215 17:06:50.970835 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:50Z","lastTransitionTime":"2026-02-15T17:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.038566 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 11:03:25.916442805 +0000 UTC Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.072876 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.072914 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.072924 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.072938 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.072949 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.175138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.175176 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.175197 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.175213 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.175225 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.278435 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.278480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.278492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.278510 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.278522 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.380834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.380874 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.380883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.380897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.380906 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.483400 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.483469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.483491 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.483520 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.483538 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.587209 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.587262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.587273 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.587291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.587301 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.690685 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.690737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.690746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.690761 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.690770 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.793772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.793835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.793854 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.793879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.793896 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.896769 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.896822 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.896835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.896852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.896863 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.999579 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.999620 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.999638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.999651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:51 crc kubenswrapper[4585]: I0215 17:06:51.999660 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:51Z","lastTransitionTime":"2026-02-15T17:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.039420 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 19:56:14.870796921 +0000 UTC Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.101727 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.101760 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.101770 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.101787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.101798 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.204743 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.204783 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.204792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.204807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.204816 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.218149 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.218193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.218206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.218226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.218239 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.243301 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:52Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.246815 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.246860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.246870 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.246883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.246893 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.259977 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:52Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.262982 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.263017 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.263028 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.263056 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.263070 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.275699 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:52Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.296109 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.296142 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.296151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.296166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.296176 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.308587 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:52Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.308749 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.311507 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.311538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.311547 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.311558 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.311565 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.413748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.413788 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.413799 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.413815 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.413829 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.516303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.516604 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.516667 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.516955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.517025 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.619764 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.619822 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.619840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.619869 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.619888 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.722518 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.722591 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.722648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.722676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.722697 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.782782 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.782970 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.783057 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:07:24.783032685 +0000 UTC m=+100.726440857 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.825180 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.825223 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.825231 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.825244 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.825254 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.841528 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.841760 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.841911 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.842052 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.842107 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.842142 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.842368 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:52 crc kubenswrapper[4585]: E0215 17:06:52.842283 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.928420 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.928456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.928465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.928512 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:52 crc kubenswrapper[4585]: I0215 17:06:52.928525 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:52Z","lastTransitionTime":"2026-02-15T17:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.030814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.030872 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.030882 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.030900 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.030908 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.040186 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 00:43:56.65466779 +0000 UTC Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.133666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.133694 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.133702 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.133714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.133723 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.236086 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.236115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.236124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.236136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.236144 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.252180 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/0.log" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.252232 4585 generic.go:334] "Generic (PLEG): container finished" podID="70645395-8d49-4495-a647-b6d43206ecbc" containerID="e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156" exitCode=1 Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.252279 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerDied","Data":"e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.252656 4585 scope.go:117] "RemoveContainer" containerID="e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.305967 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.339427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.339469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.339480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc 
kubenswrapper[4585]: I0215 17:06:53.339497 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.339509 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.358475 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 
17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.374210 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.388553 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.402110 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.413198 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.424459 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.436936 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411de
dc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.441756 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.441792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.441801 4585 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.441816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.441826 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.448151 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 
15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.466567 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\
\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.478777 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.488377 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.499352 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"2026-02-15T17:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1\\\\n2026-02-15T17:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1 to /host/opt/cni/bin/\\\\n2026-02-15T17:06:07Z [verbose] multus-daemon started\\\\n2026-02-15T17:06:07Z [verbose] Readiness Indicator file check\\\\n2026-02-15T17:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.517656 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.531870 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.543632 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.544538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.544567 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.544576 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.544589 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.544617 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.568307 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.580130 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:53Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.647625 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.647669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.647700 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.647722 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.647739 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.751150 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.751194 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.751202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.751218 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.751228 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.853450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.853487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.853497 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.853512 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.853522 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.955571 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.955661 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.955680 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.955706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:53 crc kubenswrapper[4585]: I0215 17:06:53.955722 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:53Z","lastTransitionTime":"2026-02-15T17:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.040476 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 20:17:31.411588665 +0000 UTC Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.058021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.058049 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.058057 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.058070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.058080 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.161692 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.161729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.161740 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.161754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.161763 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.258222 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/0.log" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.258275 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerStarted","Data":"8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.263864 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.263890 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.263897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.263908 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.263917 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.280173 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6
831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.298591 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.312385 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.325735 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.352704 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.367184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.367212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.367221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.367242 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.367252 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.371669 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.412019 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPF
amilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.426763 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.438711 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.450938 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.464248 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.469842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.469883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.469891 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.469905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.469914 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.479048 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.493564 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc 
kubenswrapper[4585]: I0215 17:06:54.519164 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.532509 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.550099 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.564536 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"2026-02-15T17:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1\\\\n2026-02-15T17:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1 to /host/opt/cni/bin/\\\\n2026-02-15T17:06:07Z [verbose] multus-daemon started\\\\n2026-02-15T17:06:07Z [verbose] Readiness Indicator file check\\\\n2026-02-15T17:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.572959 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.572991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.573001 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.573041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.573052 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.576175 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.676500 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.676538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.676551 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.676567 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.676579 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.780883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.780908 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.780917 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.780930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.780940 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.842813 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.842984 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.843070 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.843160 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:54 crc kubenswrapper[4585]: E0215 17:06:54.843754 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:54 crc kubenswrapper[4585]: E0215 17:06:54.843924 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:54 crc kubenswrapper[4585]: E0215 17:06:54.843975 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:54 crc kubenswrapper[4585]: E0215 17:06:54.844015 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.876150 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.888745 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.888806 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.888827 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.888854 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.888875 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.893576 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.910468 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.924137 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"2026-02-15T17:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1\\\\n2026-02-15T17:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1 to /host/opt/cni/bin/\\\\n2026-02-15T17:06:07Z [verbose] multus-daemon started\\\\n2026-02-15T17:06:07Z [verbose] 
Readiness Indicator file check\\\\n2026-02-15T17:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.937563 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411de
dc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.951665 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc 
kubenswrapper[4585]: I0215 17:06:54.968265 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.984768 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.991188 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.991218 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.991227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.991240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:54 crc kubenswrapper[4585]: I0215 17:06:54.991251 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:54Z","lastTransitionTime":"2026-02-15T17:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.001211 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:54Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.015200 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.029003 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.040875 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 21:15:07.421062844 +0000 UTC Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.043848 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.072662 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 
17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.088657 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.093424 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.093457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.093470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.093490 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.093522 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.101658 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.112727 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.126054 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.140085 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:06:55Z is after 2025-08-24T17:21:41Z" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.196963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.197018 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.197036 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc 
kubenswrapper[4585]: I0215 17:06:55.197060 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.197078 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.309123 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.310676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.310710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.310734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.310815 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.414853 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.414902 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.414917 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.414941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.414956 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.517524 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.517575 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.517594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.517651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.517668 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.619479 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.619526 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.619535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.619553 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.619563 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.723148 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.723182 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.723191 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.723206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.723215 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.824820 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.824857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.824865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.824879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.824889 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.928870 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.928912 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.928920 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.928933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:55 crc kubenswrapper[4585]: I0215 17:06:55.928945 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:55Z","lastTransitionTime":"2026-02-15T17:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.031784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.031835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.031849 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.031868 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.031883 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.041375 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 13:53:53.763845491 +0000 UTC Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.134404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.134444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.134453 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.134469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.134480 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.238304 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.238359 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.238376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.238412 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.238429 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.341015 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.341059 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.341069 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.341083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.341092 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.443231 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.443274 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.443285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.443299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.443309 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.545331 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.545363 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.545371 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.545382 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.545390 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.648251 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.648281 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.648290 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.648303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.648313 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.750857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.750913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.750930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.750959 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.750977 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.840775 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.840822 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.840993 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.841002 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:56 crc kubenswrapper[4585]: E0215 17:06:56.841119 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:56 crc kubenswrapper[4585]: E0215 17:06:56.841280 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:56 crc kubenswrapper[4585]: E0215 17:06:56.841462 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:56 crc kubenswrapper[4585]: E0215 17:06:56.841580 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.853864 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.853952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.853971 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.853993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.854046 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.957497 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.957555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.957583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.957625 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:56 crc kubenswrapper[4585]: I0215 17:06:56.957640 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:56Z","lastTransitionTime":"2026-02-15T17:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.042369 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 20:42:47.498982863 +0000 UTC Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.060706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.060775 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.060794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.060821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.060840 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.163525 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.163619 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.163634 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.163660 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.163692 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.267741 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.267805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.267829 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.267855 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.267872 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.372766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.372824 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.372843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.372921 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.372940 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.477092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.477140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.477151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.477171 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.477181 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.580349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.580393 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.580403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.580419 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.580427 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.683097 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.683162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.683191 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.683223 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.683240 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.786178 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.786221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.786232 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.786247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.786257 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.888492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.888526 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.888534 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.888550 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.888563 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.992224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.992277 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.992291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.992310 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:57 crc kubenswrapper[4585]: I0215 17:06:57.992324 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:57Z","lastTransitionTime":"2026-02-15T17:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.043309 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 14:04:03.569949049 +0000 UTC Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.095640 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.095728 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.095787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.095833 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.095857 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.199004 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.199065 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.199079 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.199098 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.199112 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.303415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.303467 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.303476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.303491 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.303500 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.407148 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.407194 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.407206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.407223 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.407234 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.511143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.511171 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.511182 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.511198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.511210 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.613954 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.613983 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.613995 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.614010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.614021 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.716846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.716895 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.716907 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.716926 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.716938 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.819691 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.819776 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.819803 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.819835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.819858 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.841387 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:06:58 crc kubenswrapper[4585]: E0215 17:06:58.841572 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.841691 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:06:58 crc kubenswrapper[4585]: E0215 17:06:58.841771 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.842150 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:06:58 crc kubenswrapper[4585]: E0215 17:06:58.842253 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.842537 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:06:58 crc kubenswrapper[4585]: E0215 17:06:58.842665 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.922875 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.923141 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.923220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.923417 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:58 crc kubenswrapper[4585]: I0215 17:06:58.923487 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:58Z","lastTransitionTime":"2026-02-15T17:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.026911 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.027224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.027484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.027743 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.027948 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.044056 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 19:20:33.013454541 +0000 UTC Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.131400 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.131442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.131459 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.131481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.131498 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.234299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.234348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.234385 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.234407 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.234418 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.337828 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.337877 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.337893 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.337916 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.337934 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.442021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.442076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.442087 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.442104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.442114 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.545693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.545752 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.545770 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.545794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.545814 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.648895 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.649291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.649367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.649445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.650169 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.753477 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.753538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.753554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.753578 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.753626 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.841763 4585 scope.go:117] "RemoveContainer" containerID="a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.856429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.856461 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.856470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.856485 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.856496 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.959800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.959858 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.959879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.959909 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:06:59 crc kubenswrapper[4585]: I0215 17:06:59.959933 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:06:59Z","lastTransitionTime":"2026-02-15T17:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.044878 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 03:08:25.377830671 +0000 UTC Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.063969 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.064026 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.064048 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.064077 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.064109 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.166427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.166459 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.166469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.166484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.166496 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.269528 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.269572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.269590 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.269644 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.269663 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.282973 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/2.log" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.286628 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.287183 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.311771 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.328477 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.351106 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.367795 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.371613 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.371651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.371663 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.371680 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.371691 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.382879 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.408946 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"2026-02-15T17:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1\\\\n2026-02-15T17:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1 to /host/opt/cni/bin/\\\\n2026-02-15T17:06:07Z [verbose] multus-daemon started\\\\n2026-02-15T17:06:07Z [verbose] 
Readiness Indicator file check\\\\n2026-02-15T17:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.430630 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411de
dc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.444150 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc 
kubenswrapper[4585]: I0215 17:07:00.464847 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.474127 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.474213 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.474226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.474248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.474259 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.480215 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.491035 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.501568 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.516844 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.528690 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.541287 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.555118 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.569697 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.577152 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.577210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.577227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.577251 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.577267 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.593283 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 
17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:00Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.679629 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.679677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.679693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.679717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.679732 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.782250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.782346 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.782364 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.782388 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.782405 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.840914 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.840966 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.840928 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.840973 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:00 crc kubenswrapper[4585]: E0215 17:07:00.841158 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:00 crc kubenswrapper[4585]: E0215 17:07:00.841230 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:00 crc kubenswrapper[4585]: E0215 17:07:00.841402 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:00 crc kubenswrapper[4585]: E0215 17:07:00.841667 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.885784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.886023 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.886092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.886159 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.886475 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.989983 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.990050 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.990068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.990089 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:00 crc kubenswrapper[4585]: I0215 17:07:00.990101 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:00Z","lastTransitionTime":"2026-02-15T17:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.045997 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 07:35:15.381714262 +0000 UTC Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.093280 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.093368 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.093391 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.093420 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.093438 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.196803 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.196866 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.196884 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.196909 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.196927 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.292910 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/3.log" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.293866 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/2.log" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.297750 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" exitCode=1 Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.297840 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.297969 4585 scope.go:117] "RemoveContainer" containerID="a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.298926 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:07:01 crc kubenswrapper[4585]: E0215 17:07:01.299183 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.300455 4585 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.300484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.300499 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.300520 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.300537 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.340551 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.361683 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.378721 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.398592 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"2026-02-15T17:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1\\\\n2026-02-15T17:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1 to /host/opt/cni/bin/\\\\n2026-02-15T17:06:07Z [verbose] multus-daemon started\\\\n2026-02-15T17:06:07Z [verbose] 
Readiness Indicator file check\\\\n2026-02-15T17:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.403735 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.403789 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.403806 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.403832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.403848 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.417635 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411dedc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.435100 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc 
kubenswrapper[4585]: I0215 17:07:01.455394 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.474372 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.494120 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.506317 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.506347 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.506357 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.506372 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.506384 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.507535 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.531808 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1c
c5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.550940 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.581820 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3bdbc886d32845ee6e9fa5a313b96d297c986ffa03722db55d1cbf30ae2bca3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:35Z\\\",\\\"message\\\":\\\":false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.176],IPF
amilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0215 17:06:35.737165 6159 lb_config.go:1031] Cluster endpoints for openshift-ingress/router-internal-default for network=default are: map[]\\\\nI0215 17:06:35.736970 6159 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0215 17:06:35.737192 6159 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 2.392199ms\\\\nF0215 17:06:35.736593 6159 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:07:00Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-4hptv in node crc\\\\nI0215 17:07:00.885229 6520 ovnkube.go:599] Stopped ovnkube\\\\nI0215 17:07:00.885872 6520 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0215 17:07:00.885725 6520 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0215 17:07:00.885894 6520 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0215 17:07:00.885909 6520 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nF0215 17:07:00.885918 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0e
be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.599413 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.608706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.608750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.608765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.608785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.608799 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.616779 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.638708 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.657868 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.675174 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:01Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.712044 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.712100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.712116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc 
kubenswrapper[4585]: I0215 17:07:01.712140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.712157 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.814900 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.814949 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.814965 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.815011 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.815031 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.918531 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.918673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.918703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.918727 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:01 crc kubenswrapper[4585]: I0215 17:07:01.918743 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:01Z","lastTransitionTime":"2026-02-15T17:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.021908 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.021969 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.021992 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.022019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.022039 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.047023 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:08:48.858507731 +0000 UTC Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.124623 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.124713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.124734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.124794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.124814 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.228111 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.228186 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.228209 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.228328 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.228358 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.304714 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/3.log" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.310944 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.311202 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.326975 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.331029 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.331086 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.331104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.331131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.331150 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.350767 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"2026-02-15T17:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1\\\\n2026-02-15T17:06:07+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1 to /host/opt/cni/bin/\\\\n2026-02-15T17:06:07Z [verbose] multus-daemon started\\\\n2026-02-15T17:06:07Z [verbose] Readiness Indicator file check\\\\n2026-02-15T17:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.371540 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411de
dc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.385904 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc 
kubenswrapper[4585]: I0215 17:07:02.409522 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.428178 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.433458 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.433519 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.433541 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.433572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.433591 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.448864 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.465941 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.488170 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.509190 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.523628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.523665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.523678 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.523695 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.523706 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.530332 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.542315 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.547176 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.547219 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.547237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.547260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.547277 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.564999 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:07:00Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-4hptv in node crc\\\\nI0215 17:07:00.885229 6520 ovnkube.go:599] Stopped ovnkube\\\\nI0215 17:07:00.885872 6520 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0215 17:07:00.885725 6520 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/iptables-alerter-4ln5h\\\\nI0215 17:07:00.885894 6520 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0215 17:07:00.885909 6520 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nF0215 17:07:00.885918 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.568370 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.572989 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.573075 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.573126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.573151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.573167 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.587672 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.594364 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.600010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.600083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.600106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.600136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.600161 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.609147 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.622844 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has 
no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c6
9fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\
\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737
e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909
bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.626725 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.628794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.628825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.628845 
4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.628862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.628873 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.650105 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a1
75a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.654429 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-15T17:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ae821cdc-4077-4a14-ad50-91dcc5071f65\\\",\\\"systemUUID\\\":\\\"8fecb70f-9a43-454c-bc0f-3400703ceb5f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.654781 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.657170 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.657230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.657248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.657273 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.657290 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.676236 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.694683 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:02Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.759853 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.759913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.759930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.759955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.759975 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.841095 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.841200 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.841397 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.841413 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.841492 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.841716 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.841352 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:02 crc kubenswrapper[4585]: E0215 17:07:02.841911 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.862151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.862214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.862234 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.862257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.862274 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.966118 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.966195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.966212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.966246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:02 crc kubenswrapper[4585]: I0215 17:07:02.966264 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:02Z","lastTransitionTime":"2026-02-15T17:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.047237 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:50:37.052093408 +0000 UTC Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.070302 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.070369 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.070385 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.070415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.070434 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.173304 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.173365 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.173383 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.173408 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.173427 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.276513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.276576 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.276593 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.276650 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.276671 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.380931 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.380991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.381007 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.381031 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.381049 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.484326 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.484397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.484417 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.484441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.484458 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.586856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.586912 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.586929 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.586952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.586970 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.690320 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.690376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.690392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.690416 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.690434 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.793542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.793684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.793710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.793740 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.793763 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.858103 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.897236 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.897375 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.897395 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.897416 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:03 crc kubenswrapper[4585]: I0215 17:07:03.897469 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:03Z","lastTransitionTime":"2026-02-15T17:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.000324 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.000387 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.000404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.000426 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.000442 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.048086 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:32:38.092070739 +0000 UTC Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.103427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.103507 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.103524 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.103549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.103565 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.206248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.206311 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.206331 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.206355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.206372 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.309458 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.309523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.309540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.309565 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.309583 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.413149 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.413208 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.413225 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.413249 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.413267 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.516344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.516436 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.516456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.516476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.516487 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.619537 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.619579 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.619617 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.619641 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.619651 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.722732 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.722772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.722783 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.722799 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.722813 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.825843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.825925 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.825950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.825987 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.826009 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.841634 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:04 crc kubenswrapper[4585]: E0215 17:07:04.841754 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.841796 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.841865 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.841910 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:04 crc kubenswrapper[4585]: E0215 17:07:04.841990 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:04 crc kubenswrapper[4585]: E0215 17:07:04.842180 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:04 crc kubenswrapper[4585]: E0215 17:07:04.842292 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.875432 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86f72c3c-eed8-4622-98ec-ab7a784305a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6eef36412946d7c001f810478c7a652dcc3d7d304253370b6c678d3e9803b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7deb4b60f167454bd913837a5031c61a5252693d8d424c7ce9428c95c4e51caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://126fb444f542ba65b204b96a49bc972517d8cbba640845bff3592dfc7150132f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a71ad87bba226aa8dd2efa062a29c44a74437ec332c0624665e511d50ec7feb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0f
a3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a98706bf9eb94c5c18f3da51d1bed7722c12e94fd62bde5c3e4a289ae8d691b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c43c3c4b247761ae5eb164a9470ea3ebee2dcd4c31ecde03f740f093d92f5f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55496bd5d086ccae968f0793e1fd4825763fe4dbd6729d2da9c4fc99f506b5f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5dece66ebe72382d73ccf605723fd1138c67e7679575fa4baf7ba24c23f678c\\\",\\\
"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:04Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.898517 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48691b4d5d530955b2f2706c353f7af2d4a4c56a7af801df2623cecffd061719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-15T17:07:04Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.915133 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7vtnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41851c0a-0f98-4a53-b102-505ee4f6b1ea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2891d6efde929d4e5d048804c0810d0328992491d1a8d35841034695c6d8aacf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52s78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7vtnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:04Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.928396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.928502 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.928575 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.928674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.928706 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:04Z","lastTransitionTime":"2026-02-15T17:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.937334 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n4ps2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70645395-8d49-4495-a647-b6d43206ecbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:06:52Z\\\",\\\"message\\\":\\\"2026-02-15T17:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1\\\\n2026-02-15T17:06:07+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16fb388a-f1ba-4d96-83ee-fd8490cf7cb1 to /host/opt/cni/bin/\\\\n2026-02-15T17:06:07Z [verbose] multus-daemon started\\\\n2026-02-15T17:06:07Z [verbose] Readiness Indicator file check\\\\n2026-02-15T17:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r966t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n4ps2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:04Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.955504 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b75f106e-0b6d-4a73-8b99-66ca018fec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47f86e44481694af3afcfbb2b9a1badd15a8067d692ab1f63492756db206079b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0c7d12d523dfd70f862e776f61eee6411de
dc3d9032231e32a9675c494544b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rtg45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-klxmj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:04Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:04 crc kubenswrapper[4585]: I0215 17:07:04.975049 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gclkf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee2e2535-c7ad-42e7-930b-8e0471dfca11\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nl4lt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gclkf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:04Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc 
kubenswrapper[4585]: I0215 17:07:05.000172 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1bfceb1-2684-441c-95cc-575cfe9feb71\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4198d11c7cda5e5613d7080dbf87211053d91aa090ec353796b650265822996e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662207881f8f943111b1b87916a583375f7976559ecb4e1cff96a9dc57746f84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbcc2bfc2a62b239cb4ed94bb5251bcc405b1ae796d8b8dfe0b7305abbef9301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:04Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.020989 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66f299349b4aa1f1a5c77b75316a2693d23bdf269a82427dbeddc23a1926f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.030985 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.031019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.031029 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.031045 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.031057 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.039446 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ee5a27f15149ce1a4b1f74adc60eb12f3b16af782268ecbc4f214a080c3953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68e67e8b6f6c6647a9145c8949b2a67e32a4408277836ad49e8f7301f16531f1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.048694 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 05:44:06.552157656 +0000 UTC Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.053567 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wvfh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff52deac-c128-47a3-b1fe-15ee558b62b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f21a2af0353e57da4523fb44390e3bcb5d1947ec6507b7949b597fffe65cc4cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9gqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wvfh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.075015 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ccfbdd4-85d7-4557-b87d-5a99a2ad8cb1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://878e4e3b6f7ec45ae99ade9ab3da53163ee295d048df7b924ae3d24f226f3d83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eda285c35e37c61b594f0ab8b8880c0c63e66aad047a657bffbcbd02f3face4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0381afc0b920d7c435d5095ed36a38c6717bf08fd1c1fbc457d7b8258ab372b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://651aa5924efed90d69b0d9428e7c8601ae721b91600c749c5e5ae6806c11ecaf\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1a1cc5d605325bc5786724e14ee8ff8c02c62b51c063b49b245a8d1340b3dd0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932
e8fae72dd5a90243477154bf180172ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7fac190a97d50eb207bf55c9fffc932e8fae72dd5a90243477154bf180172ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4681357f0d01860eb33b493c21cb77cdfbce8452b6fcd6de4dfcc7baef6730c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-15T17:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsp49\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bwj9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.089940 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"229faa8e-8a12-4a0d-94df-210f061b93ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4880fbe29ae3839309a0c788dbf02d73869757aa52899c34e89cc857e8f4c17d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84c4d470a126536489835fd48c7b32e1f0e14f2df635f678e4e3d3175666c265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84c4d470a126536489835fd48c7b32e1f0e14f2df635f678e4e3d3175666c265\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.107298 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.133412 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.133472 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.133489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.133514 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.133532 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.133451 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5acdc04-0978-4907-bd9e-965400ded9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-15T17:07:00Z\\\",\\\"message\\\":\\\"od openshift-machine-config-operator/machine-config-daemon-4hptv in node crc\\\\nI0215 
17:07:00.885229 6520 ovnkube.go:599] Stopped ovnkube\\\\nI0215 17:07:00.885872 6520 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0215 17:07:00.885725 6520 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0215 17:07:00.885894 6520 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0215 17:07:00.885909 6520 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nF0215 17:07:00.885918 6520 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://911e7b7ef5607be04d
8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h74xg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vp6tl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.153945 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e5550d-fe21-4438-9d5a-7d4169075c94\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-15T17:06:05Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0215 17:06:05.203447 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0215 17:06:05.203473 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203491 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0215 17:06:05.203556 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203567 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0215 17:06:05.203629 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0215 17:06:05.203662 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0215 17:06:05.203670 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0215 17:06:05.203671 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0215 17:06:05.203712 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0215 17:06:05.207265 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 17:06:05.203587 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2195464087/tls.crt::/tmp/serving-cert-2195464087/tls.key\\\\\\\"\\\\nI0215 17:06:05.207801 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0215 
17:06:05.208100 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0215 17:06:05.212794 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.164290 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5627210-9489-4127-a42d-82ac2bee2c4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9de797580d82ad469bcae924a00f1de3ba136d5576b452b18d064552b67b8856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74bd12c50506b53ef2908272d6e7d8ffdb4d0f43b42a3d7798c749c1dd72cd73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f352827ed6046d18a31c5b8c596772cbc68e4b6d1e8d31b0b924a48a4d1ea8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ffc49032c836557e9855a632f80c2c768a39cd742d1469d6977673cbdeb5a3e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-15T17:05:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-15T17:05:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:05:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.177992 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.190035 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.200917 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c41aeb2-e722-4379-b7d6-fe499719f9d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-15T17:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb04a175a4f070c2df32974531593139e82ae5d0edf60f15b65b3a168d292311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989a
fa08f232d2b0458522a31ab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-15T17:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcsmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-15T17:06:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hptv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-15T17:07:05Z is after 2025-08-24T17:21:41Z" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.236265 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.236328 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.236351 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc 
kubenswrapper[4585]: I0215 17:07:05.236379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.236402 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.338542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.338572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.338583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.338629 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.338645 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.441151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.441184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.441197 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.441212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.441223 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.543898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.544136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.544148 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.544164 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.544175 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.646178 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.646405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.646468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.646540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.646624 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.748628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.748888 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.748958 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.749048 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.749122 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.851852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.851922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.851944 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.851973 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.851997 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.954922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.954974 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.954992 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.955017 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:05 crc kubenswrapper[4585]: I0215 17:07:05.955034 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:05Z","lastTransitionTime":"2026-02-15T17:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.049680 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 11:13:36.031169141 +0000 UTC Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.057584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.057645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.057657 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.057675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.057685 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.160224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.160289 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.160309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.160337 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.160358 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.262847 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.262911 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.262934 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.262964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.262986 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.365541 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.365635 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.365659 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.365687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.365709 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.468871 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.468915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.468930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.468953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.468970 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.571719 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.571771 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.571789 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.571811 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.571829 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.674217 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.674270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.674288 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.674312 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.674328 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.777271 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.777309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.777321 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.777338 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.777350 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.841796 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.841846 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:06 crc kubenswrapper[4585]: E0215 17:07:06.841963 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.842029 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.842094 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:06 crc kubenswrapper[4585]: E0215 17:07:06.842123 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:06 crc kubenswrapper[4585]: E0215 17:07:06.842267 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:06 crc kubenswrapper[4585]: E0215 17:07:06.842368 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.880362 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.880399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.880412 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.880428 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.880442 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.983130 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.983182 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.983206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.983285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:06 crc kubenswrapper[4585]: I0215 17:07:06.983307 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:06Z","lastTransitionTime":"2026-02-15T17:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.050590 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:59:37.38738502 +0000 UTC Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.086646 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.086734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.086753 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.086777 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.086794 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.189557 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.189649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.189667 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.189690 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.189710 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.292794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.292857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.292874 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.292898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.292915 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.396676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.396727 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.396747 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.396771 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.396792 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.500813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.500889 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.500913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.500940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.500956 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.604670 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.604733 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.604750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.604773 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.604790 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.708135 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.708509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.708748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.708917 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.709064 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.812819 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.813216 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.813376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.813518 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.813686 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.916962 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.917021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.917043 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.917071 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:07 crc kubenswrapper[4585]: I0215 17:07:07.917088 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:07Z","lastTransitionTime":"2026-02-15T17:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.021067 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.021124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.021147 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.021199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.021224 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.051801 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 10:52:13.2066313 +0000 UTC Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.124137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.124201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.124219 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.124244 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.124262 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.226804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.226904 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.226933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.226962 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.226986 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.329545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.329579 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.329588 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.329630 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.329639 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.432919 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.432982 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.432999 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.433023 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.433039 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.452856 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.453009 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:12.452979788 +0000 UTC m=+148.396387960 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.453104 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.453284 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.453387 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.453362277 +0000 UTC m=+148.396770439 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.536456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.536515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.536533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.536555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.536574 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.555071 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.555150 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.555224 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555388 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555414 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555433 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555501 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.555473885 +0000 UTC m=+148.498882057 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555837 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555870 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555891 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.555950 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.555930637 +0000 UTC m=+148.499338809 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.556174 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.556222 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.556208874 +0000 UTC m=+148.499617036 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.640130 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.640213 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.640265 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.640291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.640311 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.744240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.744341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.744358 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.744788 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.744837 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.841398 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.841467 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.841399 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.841700 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.841774 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.841922 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.842003 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:08 crc kubenswrapper[4585]: E0215 17:07:08.842159 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.848278 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.848324 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.848342 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.848364 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.848380 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.952586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.952664 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.952679 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.952704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:08 crc kubenswrapper[4585]: I0215 17:07:08.952723 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:08Z","lastTransitionTime":"2026-02-15T17:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.052871 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 01:18:38.351041388 +0000 UTC Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.055845 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.055915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.055936 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.055966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.055984 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.158359 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.158389 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.158399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.158413 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.158424 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.262326 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.262402 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.262424 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.262455 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.262476 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.365059 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.365826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.366011 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.366204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.366399 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.468701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.468754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.468767 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.468785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.468798 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.571036 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.571068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.571076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.571109 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.571117 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.674041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.674077 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.674087 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.674103 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.674114 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.777554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.777645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.777664 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.777689 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.777732 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.881003 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.881060 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.881080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.881102 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.881122 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.983208 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.983240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.983248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.983260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:09 crc kubenswrapper[4585]: I0215 17:07:09.983271 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:09Z","lastTransitionTime":"2026-02-15T17:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.053538 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 05:03:27.924836832 +0000 UTC Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.086250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.086324 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.086352 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.086374 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.086390 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.189351 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.189414 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.189430 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.189457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.189474 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.292219 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.292284 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.292300 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.292335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.292353 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.395343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.395713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.396287 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.396481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.396696 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.499760 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.499827 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.499849 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.499878 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.499898 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.602981 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.603032 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.603048 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.603071 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.603089 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.706381 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.706455 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.706470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.706488 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.706499 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.809748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.809807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.809834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.809857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.809876 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.841718 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.841804 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:10 crc kubenswrapper[4585]: E0215 17:07:10.842060 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.842101 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.842217 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:10 crc kubenswrapper[4585]: E0215 17:07:10.842278 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:10 crc kubenswrapper[4585]: E0215 17:07:10.842430 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:10 crc kubenswrapper[4585]: E0215 17:07:10.843288 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.912840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.912898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.912916 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.912940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:10 crc kubenswrapper[4585]: I0215 17:07:10.912960 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:10Z","lastTransitionTime":"2026-02-15T17:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.015512 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.015569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.015586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.015639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.015655 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.053879 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:16:40.955523149 +0000 UTC Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.118292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.118367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.118423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.118451 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.118470 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.221482 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.221549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.221569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.221594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.221643 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.325010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.325047 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.325083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.325104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.325116 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.428379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.428458 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.428482 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.428515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.428537 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.531785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.531856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.531907 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.531935 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.531956 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.635195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.635257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.635278 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.635310 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.635332 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.738457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.738513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.738534 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.738562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.738583 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.841220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.841324 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.841344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.841408 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.841426 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.944902 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.945317 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.945462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.945633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:11 crc kubenswrapper[4585]: I0215 17:07:11.945813 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:11Z","lastTransitionTime":"2026-02-15T17:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.049723 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.049784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.049804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.049828 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.049844 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.054897 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 04:57:33.475209399 +0000 UTC Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.152524 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.152579 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.152591 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.152628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.152640 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.255990 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.256081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.256131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.256158 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.256175 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.359076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.359118 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.359128 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.359144 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.359156 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.461804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.461829 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.461836 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.461848 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.461856 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.564970 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.565118 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.565144 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.565211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.565237 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.669340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.669409 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.669432 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.669461 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.669482 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.679962 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.680029 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.680047 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.680074 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.680092 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-15T17:07:12Z","lastTransitionTime":"2026-02-15T17:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.764528 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2"] Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.765439 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.768917 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.769002 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.769196 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.769526 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.799443 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.799421355 podStartE2EDuration="1m7.799421355s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.798525543 +0000 UTC m=+88.741933695" watchObservedRunningTime="2026-02-15 17:07:12.799421355 +0000 UTC m=+88.742829507" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.805931 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/28a2ad85-9870-450c-969e-e8018d880eb2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.806000 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a2ad85-9870-450c-969e-e8018d880eb2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.806048 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28a2ad85-9870-450c-969e-e8018d880eb2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.806100 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/28a2ad85-9870-450c-969e-e8018d880eb2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.806165 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28a2ad85-9870-450c-969e-e8018d880eb2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.825574 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.825550053 podStartE2EDuration="35.825550053s" 
podCreationTimestamp="2026-02-15 17:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.812269954 +0000 UTC m=+88.755678086" watchObservedRunningTime="2026-02-15 17:07:12.825550053 +0000 UTC m=+88.768958225" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.841482 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.841897 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:12 crc kubenswrapper[4585]: E0215 17:07:12.841889 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.841965 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.842027 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:12 crc kubenswrapper[4585]: E0215 17:07:12.842213 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:12 crc kubenswrapper[4585]: E0215 17:07:12.842255 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:12 crc kubenswrapper[4585]: E0215 17:07:12.842320 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.843037 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:07:12 crc kubenswrapper[4585]: E0215 17:07:12.843316 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.858527 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podStartSLOduration=68.858511698 podStartE2EDuration="1m8.858511698s" podCreationTimestamp="2026-02-15 17:06:04 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.857939464 +0000 UTC m=+88.801347616" watchObservedRunningTime="2026-02-15 17:07:12.858511698 +0000 UTC m=+88.801919830" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.897683 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.897657957 podStartE2EDuration="1m10.897657957s" podCreationTimestamp="2026-02-15 17:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.897045782 +0000 UTC m=+88.840453934" watchObservedRunningTime="2026-02-15 17:07:12.897657957 +0000 UTC m=+88.841066129" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.907096 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28a2ad85-9870-450c-969e-e8018d880eb2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.907148 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/28a2ad85-9870-450c-969e-e8018d880eb2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.907207 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a2ad85-9870-450c-969e-e8018d880eb2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: 
\"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.907235 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28a2ad85-9870-450c-969e-e8018d880eb2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.907301 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/28a2ad85-9870-450c-969e-e8018d880eb2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.907373 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/28a2ad85-9870-450c-969e-e8018d880eb2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.907415 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/28a2ad85-9870-450c-969e-e8018d880eb2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.908180 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/28a2ad85-9870-450c-969e-e8018d880eb2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.913715 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28a2ad85-9870-450c-969e-e8018d880eb2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.933797 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7vtnf" podStartSLOduration=68.933775121 podStartE2EDuration="1m8.933775121s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.922657056 +0000 UTC m=+88.866065208" watchObservedRunningTime="2026-02-15 17:07:12.933775121 +0000 UTC m=+88.877183263" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.943433 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a2ad85-9870-450c-969e-e8018d880eb2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kxcf2\" (UID: \"28a2ad85-9870-450c-969e-e8018d880eb2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.947882 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n4ps2" podStartSLOduration=68.9478623 podStartE2EDuration="1m8.9478623s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.935214447 +0000 UTC m=+88.878622609" watchObservedRunningTime="2026-02-15 17:07:12.9478623 +0000 UTC m=+88.891270442" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.948483 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-klxmj" podStartSLOduration=67.948478116 podStartE2EDuration="1m7.948478116s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.948348703 +0000 UTC m=+88.891756865" watchObservedRunningTime="2026-02-15 17:07:12.948478116 +0000 UTC m=+88.891886248" Feb 15 17:07:12 crc kubenswrapper[4585]: I0215 17:07:12.992932 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.992907155 podStartE2EDuration="1m7.992907155s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:12.976081648 +0000 UTC m=+88.919489790" watchObservedRunningTime="2026-02-15 17:07:12.992907155 +0000 UTC m=+88.936315327" Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.024054 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wvfh6" podStartSLOduration=69.024035646 podStartE2EDuration="1m9.024035646s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:13.022146059 +0000 UTC m=+88.965554211" watchObservedRunningTime="2026-02-15 17:07:13.024035646 +0000 UTC m=+88.967443788" Feb 15 17:07:13 
crc kubenswrapper[4585]: I0215 17:07:13.050357 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bwj9b" podStartSLOduration=69.050341667 podStartE2EDuration="1m9.050341667s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:13.047901467 +0000 UTC m=+88.991309609" watchObservedRunningTime="2026-02-15 17:07:13.050341667 +0000 UTC m=+88.993749799" Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.058688 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 09:16:37.783642107 +0000 UTC Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.058770 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.072708 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.07269002 podStartE2EDuration="10.07269002s" podCreationTimestamp="2026-02-15 17:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:13.072487505 +0000 UTC m=+89.015895647" watchObservedRunningTime="2026-02-15 17:07:13.07269002 +0000 UTC m=+89.016098152" Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.073075 4585 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.093564 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" Feb 15 17:07:13 crc kubenswrapper[4585]: W0215 17:07:13.115877 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a2ad85_9870_450c_969e_e8018d880eb2.slice/crio-7d0197fcf6f98b72435eddd30fbb644703548755cacacf59f28ea00148e50fe1 WatchSource:0}: Error finding container 7d0197fcf6f98b72435eddd30fbb644703548755cacacf59f28ea00148e50fe1: Status 404 returned error can't find the container with id 7d0197fcf6f98b72435eddd30fbb644703548755cacacf59f28ea00148e50fe1 Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.360630 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" event={"ID":"28a2ad85-9870-450c-969e-e8018d880eb2","Type":"ContainerStarted","Data":"5790eb2d5f50d60cc225c4fb7e78866e1f6f19b869a5c3a8f42856f4b0ffc502"} Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.360940 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" event={"ID":"28a2ad85-9870-450c-969e-e8018d880eb2","Type":"ContainerStarted","Data":"7d0197fcf6f98b72435eddd30fbb644703548755cacacf59f28ea00148e50fe1"} Feb 15 17:07:13 crc kubenswrapper[4585]: I0215 17:07:13.381757 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kxcf2" podStartSLOduration=69.38173113 podStartE2EDuration="1m9.38173113s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:13.380545791 +0000 UTC m=+89.323953953" watchObservedRunningTime="2026-02-15 17:07:13.38173113 +0000 UTC m=+89.325139262" Feb 15 17:07:14 crc kubenswrapper[4585]: I0215 17:07:14.841000 4585 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:14 crc kubenswrapper[4585]: I0215 17:07:14.841078 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:14 crc kubenswrapper[4585]: I0215 17:07:14.842924 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:14 crc kubenswrapper[4585]: I0215 17:07:14.842985 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:14 crc kubenswrapper[4585]: E0215 17:07:14.843107 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:14 crc kubenswrapper[4585]: E0215 17:07:14.843359 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:14 crc kubenswrapper[4585]: E0215 17:07:14.843486 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:14 crc kubenswrapper[4585]: E0215 17:07:14.843832 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:16 crc kubenswrapper[4585]: I0215 17:07:16.841667 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:16 crc kubenswrapper[4585]: I0215 17:07:16.841750 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:16 crc kubenswrapper[4585]: E0215 17:07:16.841851 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:16 crc kubenswrapper[4585]: I0215 17:07:16.842005 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:16 crc kubenswrapper[4585]: I0215 17:07:16.841676 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:16 crc kubenswrapper[4585]: E0215 17:07:16.842159 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:16 crc kubenswrapper[4585]: E0215 17:07:16.842201 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:16 crc kubenswrapper[4585]: E0215 17:07:16.842298 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:18 crc kubenswrapper[4585]: I0215 17:07:18.841462 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:18 crc kubenswrapper[4585]: I0215 17:07:18.841468 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:18 crc kubenswrapper[4585]: E0215 17:07:18.841665 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:18 crc kubenswrapper[4585]: E0215 17:07:18.841804 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:18 crc kubenswrapper[4585]: I0215 17:07:18.842090 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:18 crc kubenswrapper[4585]: E0215 17:07:18.842226 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:18 crc kubenswrapper[4585]: I0215 17:07:18.843036 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:18 crc kubenswrapper[4585]: E0215 17:07:18.843491 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:20 crc kubenswrapper[4585]: I0215 17:07:20.840886 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:20 crc kubenswrapper[4585]: E0215 17:07:20.841307 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:20 crc kubenswrapper[4585]: I0215 17:07:20.840973 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:20 crc kubenswrapper[4585]: E0215 17:07:20.841396 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:20 crc kubenswrapper[4585]: I0215 17:07:20.841095 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:20 crc kubenswrapper[4585]: E0215 17:07:20.841450 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:20 crc kubenswrapper[4585]: I0215 17:07:20.840904 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:20 crc kubenswrapper[4585]: E0215 17:07:20.841501 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:22 crc kubenswrapper[4585]: I0215 17:07:22.840933 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:22 crc kubenswrapper[4585]: E0215 17:07:22.841164 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:22 crc kubenswrapper[4585]: I0215 17:07:22.841445 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:22 crc kubenswrapper[4585]: E0215 17:07:22.841531 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:22 crc kubenswrapper[4585]: I0215 17:07:22.841895 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:22 crc kubenswrapper[4585]: I0215 17:07:22.842057 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:22 crc kubenswrapper[4585]: E0215 17:07:22.842161 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:22 crc kubenswrapper[4585]: E0215 17:07:22.842283 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:24 crc kubenswrapper[4585]: I0215 17:07:24.841111 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:24 crc kubenswrapper[4585]: I0215 17:07:24.841287 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:24 crc kubenswrapper[4585]: I0215 17:07:24.841364 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:24 crc kubenswrapper[4585]: I0215 17:07:24.843701 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:24 crc kubenswrapper[4585]: E0215 17:07:24.844022 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:24 crc kubenswrapper[4585]: E0215 17:07:24.844475 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:24 crc kubenswrapper[4585]: E0215 17:07:24.844566 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:24 crc kubenswrapper[4585]: E0215 17:07:24.844675 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:24 crc kubenswrapper[4585]: I0215 17:07:24.848067 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:24 crc kubenswrapper[4585]: E0215 17:07:24.848287 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 15 17:07:24 crc kubenswrapper[4585]: E0215 17:07:24.848382 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs podName:ee2e2535-c7ad-42e7-930b-8e0471dfca11 nodeName:}" failed. No retries permitted until 2026-02-15 17:08:28.848357567 +0000 UTC m=+164.791765729 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs") pod "network-metrics-daemon-gclkf" (UID: "ee2e2535-c7ad-42e7-930b-8e0471dfca11") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 15 17:07:26 crc kubenswrapper[4585]: I0215 17:07:26.841477 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:26 crc kubenswrapper[4585]: I0215 17:07:26.841586 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:26 crc kubenswrapper[4585]: I0215 17:07:26.841752 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:26 crc kubenswrapper[4585]: I0215 17:07:26.841818 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:26 crc kubenswrapper[4585]: E0215 17:07:26.841914 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:26 crc kubenswrapper[4585]: E0215 17:07:26.841992 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:26 crc kubenswrapper[4585]: E0215 17:07:26.842108 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:26 crc kubenswrapper[4585]: E0215 17:07:26.842253 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:27 crc kubenswrapper[4585]: I0215 17:07:27.842295 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"
Feb 15 17:07:27 crc kubenswrapper[4585]: E0215 17:07:27.842868 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf"
Feb 15 17:07:28 crc kubenswrapper[4585]: I0215 17:07:28.841250 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:28 crc kubenswrapper[4585]: I0215 17:07:28.841339 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:28 crc kubenswrapper[4585]: I0215 17:07:28.841282 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:28 crc kubenswrapper[4585]: I0215 17:07:28.841267 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:28 crc kubenswrapper[4585]: E0215 17:07:28.841438 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:28 crc kubenswrapper[4585]: E0215 17:07:28.841632 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:28 crc kubenswrapper[4585]: E0215 17:07:28.841799 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:28 crc kubenswrapper[4585]: E0215 17:07:28.842156 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:30 crc kubenswrapper[4585]: I0215 17:07:30.840656 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:30 crc kubenswrapper[4585]: I0215 17:07:30.840737 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:30 crc kubenswrapper[4585]: I0215 17:07:30.840778 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:30 crc kubenswrapper[4585]: I0215 17:07:30.840799 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:30 crc kubenswrapper[4585]: E0215 17:07:30.840851 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:30 crc kubenswrapper[4585]: E0215 17:07:30.840998 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:30 crc kubenswrapper[4585]: E0215 17:07:30.841136 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:30 crc kubenswrapper[4585]: E0215 17:07:30.841299 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:32 crc kubenswrapper[4585]: I0215 17:07:32.841089 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:32 crc kubenswrapper[4585]: I0215 17:07:32.841152 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:32 crc kubenswrapper[4585]: I0215 17:07:32.841089 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:32 crc kubenswrapper[4585]: I0215 17:07:32.841230 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:32 crc kubenswrapper[4585]: E0215 17:07:32.841370 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:32 crc kubenswrapper[4585]: E0215 17:07:32.841623 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:32 crc kubenswrapper[4585]: E0215 17:07:32.841803 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:32 crc kubenswrapper[4585]: E0215 17:07:32.841955 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:34 crc kubenswrapper[4585]: I0215 17:07:34.840847 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:34 crc kubenswrapper[4585]: I0215 17:07:34.840914 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:34 crc kubenswrapper[4585]: I0215 17:07:34.840919 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:34 crc kubenswrapper[4585]: E0215 17:07:34.843271 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:34 crc kubenswrapper[4585]: I0215 17:07:34.843419 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:34 crc kubenswrapper[4585]: E0215 17:07:34.843564 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:34 crc kubenswrapper[4585]: E0215 17:07:34.843883 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:34 crc kubenswrapper[4585]: E0215 17:07:34.844139 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:36 crc kubenswrapper[4585]: I0215 17:07:36.841369 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:36 crc kubenswrapper[4585]: I0215 17:07:36.841423 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:36 crc kubenswrapper[4585]: I0215 17:07:36.841513 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:36 crc kubenswrapper[4585]: I0215 17:07:36.841523 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:36 crc kubenswrapper[4585]: E0215 17:07:36.841651 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:36 crc kubenswrapper[4585]: E0215 17:07:36.841850 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:36 crc kubenswrapper[4585]: E0215 17:07:36.842041 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:36 crc kubenswrapper[4585]: E0215 17:07:36.842096 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:38 crc kubenswrapper[4585]: I0215 17:07:38.841415 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:38 crc kubenswrapper[4585]: I0215 17:07:38.841430 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:38 crc kubenswrapper[4585]: E0215 17:07:38.842751 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:38 crc kubenswrapper[4585]: I0215 17:07:38.841494 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:38 crc kubenswrapper[4585]: E0215 17:07:38.842816 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:38 crc kubenswrapper[4585]: I0215 17:07:38.841479 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:38 crc kubenswrapper[4585]: E0215 17:07:38.843400 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:38 crc kubenswrapper[4585]: E0215 17:07:38.843204 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:39 crc kubenswrapper[4585]: I0215 17:07:39.456660 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/1.log"
Feb 15 17:07:39 crc kubenswrapper[4585]: I0215 17:07:39.457231 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/0.log"
Feb 15 17:07:39 crc kubenswrapper[4585]: I0215 17:07:39.457285 4585 generic.go:334] "Generic (PLEG): container finished" podID="70645395-8d49-4495-a647-b6d43206ecbc" containerID="8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8" exitCode=1
Feb 15 17:07:39 crc kubenswrapper[4585]: I0215 17:07:39.457322 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerDied","Data":"8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8"}
Feb 15 17:07:39 crc kubenswrapper[4585]: I0215 17:07:39.457370 4585 scope.go:117] "RemoveContainer" containerID="e0db0c512e1ef9f6674317694898ca71d83246884343bb31db4c4825bf154156"
Feb 15 17:07:39 crc kubenswrapper[4585]: I0215 17:07:39.457872 4585 scope.go:117] "RemoveContainer" containerID="8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8"
Feb 15 17:07:39 crc kubenswrapper[4585]: E0215 17:07:39.458102 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n4ps2_openshift-multus(70645395-8d49-4495-a647-b6d43206ecbc)\"" pod="openshift-multus/multus-n4ps2" podUID="70645395-8d49-4495-a647-b6d43206ecbc"
Feb 15 17:07:39 crc kubenswrapper[4585]: I0215 17:07:39.841798 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"
Feb 15 17:07:39 crc kubenswrapper[4585]: E0215 17:07:39.842055 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vp6tl_openshift-ovn-kubernetes(e5acdc04-0978-4907-bd9e-965400ded9bf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf"
Feb 15 17:07:40 crc kubenswrapper[4585]: I0215 17:07:40.462891 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/1.log"
Feb 15 17:07:40 crc kubenswrapper[4585]: I0215 17:07:40.840968 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:40 crc kubenswrapper[4585]: I0215 17:07:40.841127 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:40 crc kubenswrapper[4585]: E0215 17:07:40.841175 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:40 crc kubenswrapper[4585]: E0215 17:07:40.841393 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:40 crc kubenswrapper[4585]: I0215 17:07:40.842069 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:40 crc kubenswrapper[4585]: I0215 17:07:40.842198 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:40 crc kubenswrapper[4585]: E0215 17:07:40.842849 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:40 crc kubenswrapper[4585]: E0215 17:07:40.842937 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:42 crc kubenswrapper[4585]: I0215 17:07:42.841545 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 15 17:07:42 crc kubenswrapper[4585]: E0215 17:07:42.841813 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 15 17:07:42 crc kubenswrapper[4585]: I0215 17:07:42.841911 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 15 17:07:42 crc kubenswrapper[4585]: I0215 17:07:42.841899 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:07:42 crc kubenswrapper[4585]: E0215 17:07:42.842098 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 15 17:07:42 crc kubenswrapper[4585]: I0215 17:07:42.842300 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:07:42 crc kubenswrapper[4585]: E0215 17:07:42.842464 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 15 17:07:42 crc kubenswrapper[4585]: E0215 17:07:42.842750 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11"
Feb 15 17:07:44 crc kubenswrapper[4585]: I0215 17:07:44.841080 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:44 crc kubenswrapper[4585]: I0215 17:07:44.841109 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:44 crc kubenswrapper[4585]: I0215 17:07:44.841080 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:44 crc kubenswrapper[4585]: I0215 17:07:44.841163 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:44 crc kubenswrapper[4585]: E0215 17:07:44.843031 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:44 crc kubenswrapper[4585]: E0215 17:07:44.843139 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:44 crc kubenswrapper[4585]: E0215 17:07:44.843233 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:44 crc kubenswrapper[4585]: E0215 17:07:44.843385 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:44 crc kubenswrapper[4585]: E0215 17:07:44.846405 4585 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 15 17:07:44 crc kubenswrapper[4585]: E0215 17:07:44.929758 4585 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 15 17:07:46 crc kubenswrapper[4585]: I0215 17:07:46.841323 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:46 crc kubenswrapper[4585]: I0215 17:07:46.841501 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:46 crc kubenswrapper[4585]: I0215 17:07:46.841501 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:46 crc kubenswrapper[4585]: I0215 17:07:46.841410 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:46 crc kubenswrapper[4585]: E0215 17:07:46.841813 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:46 crc kubenswrapper[4585]: E0215 17:07:46.841902 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:46 crc kubenswrapper[4585]: E0215 17:07:46.842056 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:46 crc kubenswrapper[4585]: E0215 17:07:46.842200 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:48 crc kubenswrapper[4585]: I0215 17:07:48.841509 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:48 crc kubenswrapper[4585]: I0215 17:07:48.841555 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:48 crc kubenswrapper[4585]: I0215 17:07:48.841639 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:48 crc kubenswrapper[4585]: E0215 17:07:48.841771 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:48 crc kubenswrapper[4585]: E0215 17:07:48.841925 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:48 crc kubenswrapper[4585]: E0215 17:07:48.842012 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:48 crc kubenswrapper[4585]: I0215 17:07:48.842420 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:48 crc kubenswrapper[4585]: E0215 17:07:48.842575 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:49 crc kubenswrapper[4585]: E0215 17:07:49.932471 4585 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 15 17:07:50 crc kubenswrapper[4585]: I0215 17:07:50.841099 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:50 crc kubenswrapper[4585]: I0215 17:07:50.841233 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:50 crc kubenswrapper[4585]: I0215 17:07:50.841343 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:50 crc kubenswrapper[4585]: E0215 17:07:50.841365 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:50 crc kubenswrapper[4585]: I0215 17:07:50.841508 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:50 crc kubenswrapper[4585]: E0215 17:07:50.841666 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:50 crc kubenswrapper[4585]: E0215 17:07:50.841781 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:50 crc kubenswrapper[4585]: E0215 17:07:50.841959 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:52 crc kubenswrapper[4585]: I0215 17:07:52.841537 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:52 crc kubenswrapper[4585]: E0215 17:07:52.841791 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:52 crc kubenswrapper[4585]: I0215 17:07:52.841896 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:52 crc kubenswrapper[4585]: I0215 17:07:52.842402 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:52 crc kubenswrapper[4585]: I0215 17:07:52.842444 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:52 crc kubenswrapper[4585]: E0215 17:07:52.842682 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:52 crc kubenswrapper[4585]: E0215 17:07:52.842831 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:52 crc kubenswrapper[4585]: E0215 17:07:52.843029 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:52 crc kubenswrapper[4585]: I0215 17:07:52.843097 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:07:53 crc kubenswrapper[4585]: I0215 17:07:53.527405 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/3.log" Feb 15 17:07:53 crc kubenswrapper[4585]: I0215 17:07:53.529885 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerStarted","Data":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} Feb 15 17:07:53 crc kubenswrapper[4585]: I0215 17:07:53.530700 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:07:53 crc kubenswrapper[4585]: I0215 17:07:53.563317 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podStartSLOduration=108.563305457 podStartE2EDuration="1m48.563305457s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:07:53.561308868 +0000 UTC m=+129.504717000" watchObservedRunningTime="2026-02-15 17:07:53.563305457 +0000 UTC m=+129.506713589" Feb 15 17:07:53 crc kubenswrapper[4585]: I0215 17:07:53.866930 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gclkf"] Feb 15 17:07:53 crc kubenswrapper[4585]: I0215 17:07:53.867068 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:53 crc kubenswrapper[4585]: E0215 17:07:53.867213 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:54 crc kubenswrapper[4585]: I0215 17:07:54.841012 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:54 crc kubenswrapper[4585]: I0215 17:07:54.841524 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:54 crc kubenswrapper[4585]: I0215 17:07:54.850288 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:54 crc kubenswrapper[4585]: I0215 17:07:54.850645 4585 scope.go:117] "RemoveContainer" containerID="8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8" Feb 15 17:07:54 crc kubenswrapper[4585]: E0215 17:07:54.850799 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:54 crc kubenswrapper[4585]: E0215 17:07:54.854945 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:54 crc kubenswrapper[4585]: E0215 17:07:54.855386 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:54 crc kubenswrapper[4585]: E0215 17:07:54.933584 4585 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 15 17:07:55 crc kubenswrapper[4585]: I0215 17:07:55.540118 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/1.log" Feb 15 17:07:55 crc kubenswrapper[4585]: I0215 17:07:55.540228 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerStarted","Data":"14196f60e816bd7337ce1fe272a79514ddd5bfacb4a1106cdf6530c16feaf6ed"} Feb 15 17:07:55 crc kubenswrapper[4585]: I0215 17:07:55.841008 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:55 crc kubenswrapper[4585]: E0215 17:07:55.841272 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:56 crc kubenswrapper[4585]: I0215 17:07:56.840936 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:56 crc kubenswrapper[4585]: I0215 17:07:56.841175 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:56 crc kubenswrapper[4585]: E0215 17:07:56.841552 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:56 crc kubenswrapper[4585]: I0215 17:07:56.841201 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:56 crc kubenswrapper[4585]: E0215 17:07:56.841942 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:56 crc kubenswrapper[4585]: E0215 17:07:56.842070 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:57 crc kubenswrapper[4585]: I0215 17:07:57.841426 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:57 crc kubenswrapper[4585]: E0215 17:07:57.842813 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:07:58 crc kubenswrapper[4585]: I0215 17:07:58.840848 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:07:58 crc kubenswrapper[4585]: E0215 17:07:58.841120 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 15 17:07:58 crc kubenswrapper[4585]: I0215 17:07:58.841149 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:07:58 crc kubenswrapper[4585]: E0215 17:07:58.841325 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 15 17:07:58 crc kubenswrapper[4585]: I0215 17:07:58.841422 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:07:58 crc kubenswrapper[4585]: E0215 17:07:58.841502 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 15 17:07:59 crc kubenswrapper[4585]: I0215 17:07:59.840706 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:07:59 crc kubenswrapper[4585]: E0215 17:07:59.840911 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gclkf" podUID="ee2e2535-c7ad-42e7-930b-8e0471dfca11" Feb 15 17:08:00 crc kubenswrapper[4585]: I0215 17:08:00.841011 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:08:00 crc kubenswrapper[4585]: I0215 17:08:00.841117 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:08:00 crc kubenswrapper[4585]: I0215 17:08:00.841844 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 15 17:08:00 crc kubenswrapper[4585]: I0215 17:08:00.843893 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 15 17:08:00 crc kubenswrapper[4585]: I0215 17:08:00.844033 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 15 17:08:00 crc kubenswrapper[4585]: I0215 17:08:00.844081 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 15 17:08:00 crc kubenswrapper[4585]: I0215 17:08:00.847405 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 15 17:08:01 crc kubenswrapper[4585]: I0215 17:08:01.841429 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf"
Feb 15 17:08:01 crc kubenswrapper[4585]: I0215 17:08:01.845967 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 15 17:08:01 crc kubenswrapper[4585]: I0215 17:08:01.846216 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 15 17:08:03 crc kubenswrapper[4585]: I0215 17:08:03.912003 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 15 17:08:03 crc kubenswrapper[4585]: I0215 17:08:03.970422 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-69rgc"]
Feb 15 17:08:03 crc kubenswrapper[4585]: I0215 17:08:03.971053 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:03 crc kubenswrapper[4585]: I0215 17:08:03.972770 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"]
Feb 15 17:08:03 crc kubenswrapper[4585]: I0215 17:08:03.973774 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"
Feb 15 17:08:03 crc kubenswrapper[4585]: I0215 17:08:03.982363 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"]
Feb 15 17:08:03 crc kubenswrapper[4585]: I0215 17:08:03.982878 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.004531 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gjvln"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.005339 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.013252 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 15 17:08:04 crc kubenswrapper[4585]: W0215 17:08:04.013520 4585 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 15 17:08:04 crc kubenswrapper[4585]: E0215 17:08:04.013564 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.013669 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.013852 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.014089 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.014130 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.014517 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.014561 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.014782 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.014919 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.015043 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.015156 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.016011 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ncfjb"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.016556 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.017307 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.018150 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.018219 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.018337 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.018479 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.021563 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.022092 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.022416 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.022908 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.023271 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.023563 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.024586 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.024891 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.025153 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.025232 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.025287 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.025814 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.026388 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-24xpt"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.026703 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-gfx64"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.027011 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gfx64"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.027056 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.027276 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.027338 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.027955 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.028109 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.028320 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.028910 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.036448 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.036999 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037061 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037241 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037353 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037473 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037539 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037691 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037247 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.037003 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.043952 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.044752 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070042 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-st5w4"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070115 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d76d195a-b0da-4f95-9bc3-a7d9510e749a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070188 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-audit-policies\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070244 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-encryption-config\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070312 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-252xr\" (UniqueName: \"kubernetes.io/projected/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-kube-api-access-252xr\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070367 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-serving-cert\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070407 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-config\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070440 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d76d195a-b0da-4f95-9bc3-a7d9510e749a-images\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-serving-cert\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070516 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76d195a-b0da-4f95-9bc3-a7d9510e749a-config\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070552 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5mn\" (UniqueName: \"kubernetes.io/projected/d76d195a-b0da-4f95-9bc3-a7d9510e749a-kube-api-access-lj5mn\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070587 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-etcd-client\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070642 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcx6t\" (UniqueName: \"kubernetes.io/projected/f55299e6-4571-42b7-b96f-35d8612609d2-kube-api-access-vcx6t\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070677 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-client-ca\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070719 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070775 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f55299e6-4571-42b7-b96f-35d8612609d2-audit-dir\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070812 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.070893 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.071011 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.071636 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.071790 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.073417 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.073675 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.073847 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.073919 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.074085 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.074112 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.074198 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.074381 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.074820 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.075093 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.075628 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.088965 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xptvq"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.091214 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-69rgc"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.091313 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.092238 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.110977 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-924b8"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.112095 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.112466 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.113106 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.113418 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-924b8"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.114711 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.117074 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.117582 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.119845 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.117908 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.117980 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.118189 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.118750 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.118786 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.121332 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.121404 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.121482 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.121561 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.121659 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.121737 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.122915 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.123020 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.123097 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.123859 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.126061 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.126265 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.126368 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.126471 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.126579 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.127665 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.127888 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.127907 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.128428 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.128920 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129037 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129143 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129231 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129319 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129443 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129555 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129590 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129559 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129689 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129760 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129816 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129846 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129931 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129972 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.129941 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.130326 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.130724 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.130962 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.133684 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.134283 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nl8js"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.134674 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nl8js"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.134875 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.139802 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.140329 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.140784 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-cfzm8"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.141380 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cfzm8"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.142122 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.143301 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.164691 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.165059 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.165862 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.170198 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.170615 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.170629 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.171051 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.171266 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.171967 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179309 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-etcd-client\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179346 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcx6t\" (UniqueName: \"kubernetes.io/projected/f55299e6-4571-42b7-b96f-35d8612609d2-kube-api-access-vcx6t\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179383 4585
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-client-ca\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179405 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179435 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f55299e6-4571-42b7-b96f-35d8612609d2-audit-dir\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179456 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179481 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-policies\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc 
kubenswrapper[4585]: I0215 17:08:04.179514 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-config\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179536 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp2v4\" (UniqueName: \"kubernetes.io/projected/f1907e91-47d3-4f5f-b701-bcd299d3b95b-kube-api-access-qp2v4\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179561 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85m28\" (UniqueName: \"kubernetes.io/projected/f8cc3912-1818-43de-94a2-00544a8fd90d-kube-api-access-85m28\") pod \"cluster-samples-operator-665b6dd947-sz7kd\" (UID: \"f8cc3912-1818-43de-94a2-00544a8fd90d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179622 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179643 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-trusted-ca\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179666 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d76d195a-b0da-4f95-9bc3-a7d9510e749a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179685 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179706 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179749 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-audit-policies\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179784 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179809 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179850 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-encryption-config\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.179877 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps9f4\" (UniqueName: 
\"kubernetes.io/projected/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-kube-api-access-ps9f4\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180088 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180112 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180149 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-252xr\" (UniqueName: \"kubernetes.io/projected/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-kube-api-access-252xr\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180304 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: 
\"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180346 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-serving-cert\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180369 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-config\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180387 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8cc3912-1818-43de-94a2-00544a8fd90d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sz7kd\" (UID: \"f8cc3912-1818-43de-94a2-00544a8fd90d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180652 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180682 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d76d195a-b0da-4f95-9bc3-a7d9510e749a-images\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180708 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-serving-cert\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180880 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-serving-cert\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180899 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76d195a-b0da-4f95-9bc3-a7d9510e749a-config\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180921 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 
17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.180943 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.181147 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5mn\" (UniqueName: \"kubernetes.io/projected/d76d195a-b0da-4f95-9bc3-a7d9510e749a-kube-api-access-lj5mn\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.181169 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-dir\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.194417 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-etcd-client\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.195421 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-client-ca\") pod \"controller-manager-879f6c89f-69rgc\" (UID: 
\"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.195992 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.196234 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.196291 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f55299e6-4571-42b7-b96f-35d8612609d2-audit-dir\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.196762 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.197703 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f55299e6-4571-42b7-b96f-35d8612609d2-audit-policies\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: 
\"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.198111 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.199651 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.201814 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.202976 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d76d195a-b0da-4f95-9bc3-a7d9510e749a-images\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.216277 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-encryption-config\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.218309 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76d195a-b0da-4f95-9bc3-a7d9510e749a-config\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.220671 
4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-serving-cert\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.221466 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d76d195a-b0da-4f95-9bc3-a7d9510e749a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.221673 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.227982 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.230962 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.243955 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.244590 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.244865 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.245193 4585 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.245379 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.245504 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hq2n4"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.245717 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.245814 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.246253 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.246367 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.246760 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.246963 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-config\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.247136 4585 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.247383 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.247551 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.248047 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.248153 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.251664 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z5d9n"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.252054 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gjvln"] Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.252132 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.252217 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.253436 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ncfjb"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.253884 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.263107 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.263875 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.264289 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.265249 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.265843 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.270413 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.271224 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.274832 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.277627 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.278627 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.278732 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.279316 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xkhhs"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.280163 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.280790 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.281334 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.281887 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-policies\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.282043 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-config\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.282931 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp2v4\" (UniqueName: \"kubernetes.io/projected/f1907e91-47d3-4f5f-b701-bcd299d3b95b-kube-api-access-qp2v4\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283080 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283162 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85m28\" (UniqueName: \"kubernetes.io/projected/f8cc3912-1818-43de-94a2-00544a8fd90d-kube-api-access-85m28\") pod \"cluster-samples-operator-665b6dd947-sz7kd\" (UID: \"f8cc3912-1818-43de-94a2-00544a8fd90d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283229 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-policies\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283235 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-trusted-ca\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283323 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283357 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283399 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283420 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283442 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps9f4\" (UniqueName: \"kubernetes.io/projected/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-kube-api-access-ps9f4\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283463 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283496 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283538 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283570 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8cc3912-1818-43de-94a2-00544a8fd90d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sz7kd\" (UID: \"f8cc3912-1818-43de-94a2-00544a8fd90d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283588 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283632 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-serving-cert\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283650 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283668 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283693 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-dir\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.283758 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-dir\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.284628 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-24xpt"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.284682 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gfx64"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.284697 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.285638 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-config\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.286109 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-trusted-ca\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.286947 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.287526 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.290646 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.292463 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.292689 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.292798 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.293843 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.293886 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-serving-cert\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.294801 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.295446 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.296201 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.298015 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8cc3912-1818-43de-94a2-00544a8fd90d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sz7kd\" (UID: \"f8cc3912-1818-43de-94a2-00544a8fd90d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.298108 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jt5m5"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.299755 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6vf5k"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.300333 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jt5m5"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.300361 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.301521 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.302745 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cfzm8"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.304951 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.305881 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.307122 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-st5w4"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.308518 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.311993 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.313205 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.314073 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.314963 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.317778 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-s2xnb"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.319224 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-s2xnb"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.321247 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.328700 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.329159 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.333071 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.336400 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.342580 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.344573 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.347436 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hq2n4"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.368900 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xptvq"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.369442 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z5d9n"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.370715 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.372044 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.372653 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.372847 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-924b8"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.376954 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.381909 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.383142 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.385739 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.385778 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xkhhs"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.386614 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kzxfx"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.387296 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kzxfx"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.387907 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.389070 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.390231 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.392022 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.393151 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jt5m5"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.393278 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.395497 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kzxfx"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.396808 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6vf5k"]
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.414803 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.434739 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.453770 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.473089 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.493355 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.513120 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.533452 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.553410 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.572458 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.593839 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.613076 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.634521 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.654086 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.673966 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.693390 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.714839 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.734358 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.774261 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.795497 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.814046 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.833333 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.853575 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.874025 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.893061 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.933489 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.937802 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcx6t\" (UniqueName: \"kubernetes.io/projected/f55299e6-4571-42b7-b96f-35d8612609d2-kube-api-access-vcx6t\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.953559 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.973843 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 15 17:08:04 crc kubenswrapper[4585]: I0215 17:08:04.997194 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.034155 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.040303 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-252xr\" (UniqueName: \"kubernetes.io/projected/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-kube-api-access-252xr\") pod \"controller-manager-879f6c89f-69rgc\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.053540 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.074485 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.093737 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.113922 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.133853 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.153802 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.174463 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.197931 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc"
Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.204073 4585 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.204194 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-serving-cert podName:f55299e6-4571-42b7-b96f-35d8612609d2 nodeName:}" failed. No retries permitted until 2026-02-15 17:08:05.704160226 +0000 UTC m=+141.647568528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-serving-cert") pod "apiserver-7bbb656c7d-fg58h" (UID: "f55299e6-4571-42b7-b96f-35d8612609d2") : failed to sync secret cache: timed out waiting for the condition
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.214866 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.227531 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5mn\" (UniqueName: \"kubernetes.io/projected/d76d195a-b0da-4f95-9bc3-a7d9510e749a-kube-api-access-lj5mn\") pod \"machine-api-operator-5694c8668f-qdp6s\" (UID: \"d76d195a-b0da-4f95-9bc3-a7d9510e749a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.234314 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 15
17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.251668 4585 request.go:700] Waited for 1.004514834s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0 Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.253003 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.274567 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.293718 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.316985 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.333262 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.354734 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.375565 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.393843 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.413677 4585 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.432777 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-69rgc"] Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.442355 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.454302 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.473001 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.492890 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.511737 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.533694 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.552972 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.573932 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.593438 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.598800 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-image-import-ca\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.598826 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-config\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.598843 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5419d7c7-80b9-446a-bb53-cfdc7f90a964-machine-approver-tls\") pod 
\"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.598860 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-trusted-ca\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.598877 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1b22c6-f790-4713-963f-2a4f2141ac57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.598903 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-client\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.598993 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl8nd\" (UniqueName: \"kubernetes.io/projected/9f407773-408b-4d66-a516-59646429f2fb-kube-api-access-zl8nd\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599028 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-service-ca\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599083 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/146b62e3-ce60-47c0-aec8-8811a20bddfe-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599123 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-config\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599189 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-trusted-ca-bundle\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599222 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599251 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146b62e3-ce60-47c0-aec8-8811a20bddfe-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599266 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146b62e3-ce60-47c0-aec8-8811a20bddfe-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599285 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzv4\" (UniqueName: \"kubernetes.io/projected/6a1b22c6-f790-4713-963f-2a4f2141ac57-kube-api-access-ngzv4\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599303 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-encryption-config\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599320 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c50873-2a59-455e-a743-b86618388940-serving-cert\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599337 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-service-ca-bundle\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599407 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23788c17-8897-4c56-b718-ebf061e5e15c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599753 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-oauth-config\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599783 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smt46\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-kube-api-access-smt46\") pod 
\"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599800 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-service-ca\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599814 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8hm7\" (UniqueName: \"kubernetes.io/projected/a7dbb573-46b2-46e7-8b4c-ca1737c36335-kube-api-access-k8hm7\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599940 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7dbb573-46b2-46e7-8b4c-ca1737c36335-serving-cert\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599967 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-audit\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.599983 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-registry-certificates\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600002 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85246\" (UniqueName: \"kubernetes.io/projected/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-kube-api-access-85246\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600030 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600063 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23788c17-8897-4c56-b718-ebf061e5e15c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600090 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7gjd\" (UniqueName: \"kubernetes.io/projected/146b62e3-ce60-47c0-aec8-8811a20bddfe-kube-api-access-l7gjd\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 
17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600255 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-etcd-client\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600340 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-ca\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600415 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-config\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600449 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v262\" (UniqueName: \"kubernetes.io/projected/5419d7c7-80b9-446a-bb53-cfdc7f90a964-kube-api-access-9v262\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600502 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-bound-sa-token\") pod \"image-registry-697d97f7c8-st5w4\" (UID: 
\"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600533 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-config\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600565 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p84r\" (UniqueName: \"kubernetes.io/projected/d9c426cb-d8ae-4150-adb2-327d42b7df5b-kube-api-access-7p84r\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600599 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5419d7c7-80b9-446a-bb53-cfdc7f90a964-config\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600655 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-serving-cert\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600725 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-registry-tls\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600759 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-client-ca\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600809 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600850 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600871 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf6f5\" (UniqueName: \"kubernetes.io/projected/99c50873-2a59-455e-a743-b86618388940-kube-api-access-qf6f5\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: 
\"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600912 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a1b22c6-f790-4713-963f-2a4f2141ac57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600930 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5419d7c7-80b9-446a-bb53-cfdc7f90a964-auth-proxy-config\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.600985 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f407773-408b-4d66-a516-59646429f2fb-node-pullsecrets\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601001 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9c426cb-d8ae-4150-adb2-327d42b7df5b-serving-cert\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601074 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a16374fa-419c-416f-86ab-0f18c37da52c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601092 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88vw\" (UniqueName: \"kubernetes.io/projected/a16374fa-419c-416f-86ab-0f18c37da52c-kube-api-access-n88vw\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601159 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-serving-cert\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601214 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-config\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601317 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-oauth-serving-cert\") pod \"console-f9d7485db-gfx64\" (UID: 
\"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601353 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f407773-408b-4d66-a516-59646429f2fb-audit-dir\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.601380 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.101360804 +0000 UTC m=+142.044768946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.601424 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a16374fa-419c-416f-86ab-0f18c37da52c-serving-cert\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.607331 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" 
event={"ID":"fe40fba3-3513-4a78-905a-15ffd6b2f8b2","Type":"ContainerStarted","Data":"a23241966c44a5a88cc68d83613790baa563b81ddf7f94457d7dd8e50f0aa887"} Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.607369 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" event={"ID":"fe40fba3-3513-4a78-905a-15ffd6b2f8b2","Type":"ContainerStarted","Data":"fe34f6a6c6dde9e11fc4d9b275bc4440c877503edf38f80abcb2786ef64c6dc4"} Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.609348 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.609426 4585 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-69rgc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.609452 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" podUID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.613741 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.632990 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.652772 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.673421 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.693880 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703038 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.703221 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.203186935 +0000 UTC m=+142.146595067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703288 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-etcd-client\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703320 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-ca\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703346 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77fb475f-5fd7-4d9e-a81c-9afe54902813-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703371 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbvrj\" (UniqueName: \"kubernetes.io/projected/a3402525-92f4-4cf0-9fee-43faccdc51bd-kube-api-access-zbvrj\") pod 
\"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703389 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-config\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703407 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v262\" (UniqueName: \"kubernetes.io/projected/5419d7c7-80b9-446a-bb53-cfdc7f90a964-kube-api-access-9v262\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.703425 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-certs\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.704789 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-mountpoint-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.704893 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-bound-sa-token\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.704929 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5419d7c7-80b9-446a-bb53-cfdc7f90a964-config\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.704954 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593dc5c4-4044-44c3-bc9c-93b61527da19-config\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.704981 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rt7b\" (UniqueName: \"kubernetes.io/projected/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-kube-api-access-5rt7b\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705023 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:05 
crc kubenswrapper[4585]: I0215 17:08:05.705049 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-registry-tls\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705073 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-client-ca\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705097 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0c1d0020-9917-42f4-b0af-587a46e23d3e-signing-cabundle\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705125 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a1b22c6-f790-4713-963f-2a4f2141ac57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705147 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5419d7c7-80b9-446a-bb53-cfdc7f90a964-auth-proxy-config\") pod \"machine-approver-56656f9798-fjk9c\" (UID: 
\"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705177 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-profile-collector-cert\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705204 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv8m2\" (UniqueName: \"kubernetes.io/projected/7970ebfe-806a-44c4-9756-6d5bc4903eac-kube-api-access-tv8m2\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705241 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705269 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjn44\" (UniqueName: \"kubernetes.io/projected/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-kube-api-access-gjn44\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:05 crc kubenswrapper[4585]: 
I0215 17:08:05.705298 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3c2015e-dd19-4448-ae65-bce4f36d35c5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705320 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmzpq\" (UniqueName: \"kubernetes.io/projected/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-kube-api-access-kmzpq\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705345 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9c426cb-d8ae-4150-adb2-327d42b7df5b-serving-cert\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705369 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a7b248c-9d3e-4ed9-802e-e381f76846e4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hq2n4\" (UID: \"3a7b248c-9d3e-4ed9-802e-e381f76846e4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705390 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-serving-cert\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705410 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/593dc5c4-4044-44c3-bc9c-93b61527da19-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705433 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7djsn\" (UniqueName: \"kubernetes.io/projected/de36c82b-03fe-4750-95a9-75c7ee2e68bd-kube-api-access-7djsn\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705455 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f4a804-22d1-4cdf-a403-d8e81dc3233e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd9c9717-dbe1-42fd-9648-e4a8cc4ec882-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-drg89\" (UID: \"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705502 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-serving-cert\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705524 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdn9k\" (UniqueName: \"kubernetes.io/projected/d3c2015e-dd19-4448-ae65-bce4f36d35c5-kube-api-access-fdn9k\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705545 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a11420c-890e-4314-ba16-1867be7c401c-config-volume\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705567 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-socket-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705617 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8c5v\" (UniqueName: 
\"kubernetes.io/projected/64ee5f2d-78e3-4c7b-a1ea-1b38d888d043-kube-api-access-w8c5v\") pod \"dns-operator-744455d44c-924b8\" (UID: \"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043\") " pod="openshift-dns-operator/dns-operator-744455d44c-924b8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705642 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705669 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jb6t\" (UniqueName: \"kubernetes.io/projected/9a11420c-890e-4314-ba16-1867be7c401c-kube-api-access-7jb6t\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705694 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-stats-auth\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705720 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-config\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705741 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7hg4\" (UniqueName: \"kubernetes.io/projected/389330df-47c0-4815-9070-2664655acaab-kube-api-access-d7hg4\") pod \"control-plane-machine-set-operator-78cbb6b69f-zfq7z\" (UID: \"389330df-47c0-4815-9070-2664655acaab\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705767 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705791 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpgcc\" (UniqueName: \"kubernetes.io/projected/b6b450dc-9948-4b88-b099-3d1aebf653d3-kube-api-access-qpgcc\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705811 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705837 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-oauth-serving-cert\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705857 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-csi-data-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705866 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-ca\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705882 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/389330df-47c0-4815-9070-2664655acaab-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zfq7z\" (UID: \"389330df-47c0-4815-9070-2664655acaab\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705909 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b450dc-9948-4b88-b099-3d1aebf653d3-config-volume\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:05 crc kubenswrapper[4585]: 
I0215 17:08:05.705936 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.705975 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fb475f-5fd7-4d9e-a81c-9afe54902813-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.704889 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qdp6s"] Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.706510 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5419d7c7-80b9-446a-bb53-cfdc7f90a964-config\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707026 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1b22c6-f790-4713-963f-2a4f2141ac57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707060 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a11420c-890e-4314-ba16-1867be7c401c-metrics-tls\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707090 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707114 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-trusted-ca\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707138 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ht5\" (UniqueName: \"kubernetes.io/projected/dd9c9717-dbe1-42fd-9648-e4a8cc4ec882-kube-api-access-z4ht5\") pod \"package-server-manager-789f6589d5-drg89\" (UID: \"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707183 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl8nd\" (UniqueName: \"kubernetes.io/projected/9f407773-408b-4d66-a516-59646429f2fb-kube-api-access-zl8nd\") pod \"apiserver-76f77b778f-gjvln\" (UID: 
\"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707206 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-service-ca\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707234 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/146b62e3-ce60-47c0-aec8-8811a20bddfe-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707257 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wlqm\" (UniqueName: \"kubernetes.io/projected/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-kube-api-access-6wlqm\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707282 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-metrics-certs\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707308 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707338 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146b62e3-ce60-47c0-aec8-8811a20bddfe-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707362 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-encryption-config\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707389 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70f4a804-22d1-4cdf-a403-d8e81dc3233e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707416 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-proxy-tls\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" 
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707456 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-service-ca-bundle\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707479 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-config\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707506 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c50873-2a59-455e-a743-b86618388940-serving-cert\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707531 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62ffg\" (UniqueName: \"kubernetes.io/projected/3a7b248c-9d3e-4ed9-802e-e381f76846e4-kube-api-access-62ffg\") pod \"multus-admission-controller-857f4d67dd-hq2n4\" (UID: \"3a7b248c-9d3e-4ed9-802e-e381f76846e4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707556 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/23788c17-8897-4c56-b718-ebf061e5e15c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707586 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-service-ca\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.707631 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8hm7\" (UniqueName: \"kubernetes.io/projected/a7dbb573-46b2-46e7-8b4c-ca1737c36335-kube-api-access-k8hm7\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708007 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7dbb573-46b2-46e7-8b4c-ca1737c36335-serving-cert\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708034 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhwxd\" (UniqueName: \"kubernetes.io/projected/a1985842-4918-4ebf-ac2a-1a08465d06df-kube-api-access-lhwxd\") pod \"migrator-59844c95c7-5m8q2\" (UID: \"a1985842-4918-4ebf-ac2a-1a08465d06df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708037 4585 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5419d7c7-80b9-446a-bb53-cfdc7f90a964-auth-proxy-config\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708057 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7970ebfe-806a-44c4-9756-6d5bc4903eac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708154 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-registry-certificates\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708185 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85246\" (UniqueName: \"kubernetes.io/projected/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-kube-api-access-85246\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708214 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-config\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 
17:08:05.708208 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708266 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b450dc-9948-4b88-b099-3d1aebf653d3-secret-volume\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708291 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gjd\" (UniqueName: \"kubernetes.io/projected/146b62e3-ce60-47c0-aec8-8811a20bddfe-kube-api-access-l7gjd\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708338 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx4sp\" (UniqueName: \"kubernetes.io/projected/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-kube-api-access-tx4sp\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708369 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-registration-dir\") pod 
\"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708444 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-config\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708520 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p84r\" (UniqueName: \"kubernetes.io/projected/d9c426cb-d8ae-4150-adb2-327d42b7df5b-kube-api-access-7p84r\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708543 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708546 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1b22c6-f790-4713-963f-2a4f2141ac57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708588 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vj45\" (UniqueName: \"kubernetes.io/projected/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-kube-api-access-4vj45\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708637 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-serving-cert\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708684 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f4a804-22d1-4cdf-a403-d8e81dc3233e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708703 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44qb\" (UniqueName: \"kubernetes.io/projected/0c1d0020-9917-42f4-b0af-587a46e23d3e-kube-api-access-c44qb\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708722 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-srv-cert\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708752 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708772 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf6f5\" (UniqueName: \"kubernetes.io/projected/99c50873-2a59-455e-a743-b86618388940-kube-api-access-qf6f5\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708889 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-metrics-tls\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708907 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/593dc5c4-4044-44c3-bc9c-93b61527da19-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708935 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a16374fa-419c-416f-86ab-0f18c37da52c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708951 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88vw\" (UniqueName: \"kubernetes.io/projected/a16374fa-419c-416f-86ab-0f18c37da52c-kube-api-access-n88vw\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708976 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f407773-408b-4d66-a516-59646429f2fb-node-pullsecrets\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.708993 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-plugins-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709021 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-882nf\" (UniqueName: \"kubernetes.io/projected/bd15b714-c5de-4649-b152-deaa5e374ec5-kube-api-access-882nf\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709040 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73923b56-3ae8-426f-acac-cf6aedb40970-cert\") pod \"ingress-canary-kzxfx\" (UID: \"73923b56-3ae8-426f-acac-cf6aedb40970\") " pod="openshift-ingress-canary/ingress-canary-kzxfx" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709057 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d3c2015e-dd19-4448-ae65-bce4f36d35c5-images\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709067 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-service-ca\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709085 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a16374fa-419c-416f-86ab-0f18c37da52c-serving-cert\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709105 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/64ee5f2d-78e3-4c7b-a1ea-1b38d888d043-metrics-tls\") pod \"dns-operator-744455d44c-924b8\" (UID: \"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043\") " pod="openshift-dns-operator/dns-operator-744455d44c-924b8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709125 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng257\" (UniqueName: \"kubernetes.io/projected/73923b56-3ae8-426f-acac-cf6aedb40970-kube-api-access-ng257\") pod \"ingress-canary-kzxfx\" (UID: \"73923b56-3ae8-426f-acac-cf6aedb40970\") " pod="openshift-ingress-canary/ingress-canary-kzxfx" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709143 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4tgk\" (UniqueName: \"kubernetes.io/projected/149cbe83-6c8e-471a-8534-73fa827b39a6-kube-api-access-t4tgk\") pod \"downloads-7954f5f757-cfzm8\" (UID: \"149cbe83-6c8e-471a-8534-73fa827b39a6\") " pod="openshift-console/downloads-7954f5f757-cfzm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709163 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f407773-408b-4d66-a516-59646429f2fb-audit-dir\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709183 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-config\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709200 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5419d7c7-80b9-446a-bb53-cfdc7f90a964-machine-approver-tls\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-image-import-ca\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709234 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-trusted-ca\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709269 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd15b714-c5de-4649-b152-deaa5e374ec5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709287 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-default-certificate\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " 
pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709304 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7970ebfe-806a-44c4-9756-6d5bc4903eac-tmpfs\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709321 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-client\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709337 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3402525-92f4-4cf0-9fee-43faccdc51bd-service-ca-bundle\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709381 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-srv-cert\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709401 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-serving-cert\") pod 
\"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709420 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0c1d0020-9917-42f4-b0af-587a46e23d3e-signing-key\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709443 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-config\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709507 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-registry-certificates\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709696 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-trusted-ca-bundle\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709716 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/7970ebfe-806a-44c4-9756-6d5bc4903eac-webhook-cert\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709757 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146b62e3-ce60-47c0-aec8-8811a20bddfe-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709778 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzv4\" (UniqueName: \"kubernetes.io/projected/6a1b22c6-f790-4713-963f-2a4f2141ac57-kube-api-access-ngzv4\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709797 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff7jl\" (UniqueName: \"kubernetes.io/projected/77fb475f-5fd7-4d9e-a81c-9afe54902813-kube-api-access-ff7jl\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709838 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3c2015e-dd19-4448-ae65-bce4f36d35c5-proxy-tls\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-oauth-config\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709877 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smt46\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-kube-api-access-smt46\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709928 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-audit\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709948 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-node-bootstrap-token\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.709986 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-config\") pod 
\"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.710003 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd15b714-c5de-4649-b152-deaa5e374ec5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.710019 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mwf\" (UniqueName: \"kubernetes.io/projected/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-kube-api-access-r7mwf\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.710039 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23788c17-8897-4c56-b718-ebf061e5e15c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.710490 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23788c17-8897-4c56-b718-ebf061e5e15c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.710951 
4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9c426cb-d8ae-4150-adb2-327d42b7df5b-serving-cert\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.711354 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-config\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.711454 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-etcd-client\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.712214 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/146b62e3-ce60-47c0-aec8-8811a20bddfe-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.712475 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-image-import-ca\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.713090 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-registry-tls\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.713678 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.714017 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-service-ca\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.714055 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a1b22c6-f790-4713-963f-2a4f2141ac57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.714257 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-config\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"
Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.714270 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.214253667 +0000 UTC m=+142.157661809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.714688 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-etcd-serving-ca\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.714753 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5419d7c7-80b9-446a-bb53-cfdc7f90a964-machine-approver-tls\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.715747 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-trusted-ca\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.716150 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23788c17-8897-4c56-b718-ebf061e5e15c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.716224 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7dbb573-46b2-46e7-8b4c-ca1737c36335-serving-cert\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.716857 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-service-ca-bundle\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.716918 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-oauth-serving-cert\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.717310 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.721569 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99c50873-2a59-455e-a743-b86618388940-serving-cert\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.721735 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a7dbb573-46b2-46e7-8b4c-ca1737c36335-etcd-client\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.722014 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-serving-cert\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.722186 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-serving-cert\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.722823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-trusted-ca-bundle\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.723096 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9f407773-408b-4d66-a516-59646429f2fb-audit\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.723871 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146b62e3-ce60-47c0-aec8-8811a20bddfe-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.725115 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9f407773-408b-4d66-a516-59646429f2fb-audit-dir\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.725346 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9f407773-408b-4d66-a516-59646429f2fb-encryption-config\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.727499 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7dbb573-46b2-46e7-8b4c-ca1737c36335-config\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.727555 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c50873-2a59-455e-a743-b86618388940-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.728797 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-oauth-config\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.729405 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a16374fa-419c-416f-86ab-0f18c37da52c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.729436 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f407773-408b-4d66-a516-59646429f2fb-node-pullsecrets\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.730111 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-config\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.730831 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-client-ca\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.733013 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.738228 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a16374fa-419c-416f-86ab-0f18c37da52c-serving-cert\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.753536 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.773671 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.795101 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811225 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.811375 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.311345468 +0000 UTC m=+142.254753640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811481 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-plugins-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811525 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d3c2015e-dd19-4448-ae65-bce4f36d35c5-images\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811561 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-882nf\" (UniqueName: \"kubernetes.io/projected/bd15b714-c5de-4649-b152-deaa5e374ec5-kube-api-access-882nf\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811594 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73923b56-3ae8-426f-acac-cf6aedb40970-cert\") pod \"ingress-canary-kzxfx\" (UID: \"73923b56-3ae8-426f-acac-cf6aedb40970\") " pod="openshift-ingress-canary/ingress-canary-kzxfx"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811685 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4tgk\" (UniqueName: \"kubernetes.io/projected/149cbe83-6c8e-471a-8534-73fa827b39a6-kube-api-access-t4tgk\") pod \"downloads-7954f5f757-cfzm8\" (UID: \"149cbe83-6c8e-471a-8534-73fa827b39a6\") " pod="openshift-console/downloads-7954f5f757-cfzm8"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811722 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64ee5f2d-78e3-4c7b-a1ea-1b38d888d043-metrics-tls\") pod \"dns-operator-744455d44c-924b8\" (UID: \"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043\") " pod="openshift-dns-operator/dns-operator-744455d44c-924b8"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811752 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng257\" (UniqueName: \"kubernetes.io/projected/73923b56-3ae8-426f-acac-cf6aedb40970-kube-api-access-ng257\") pod \"ingress-canary-kzxfx\" (UID: \"73923b56-3ae8-426f-acac-cf6aedb40970\") " pod="openshift-ingress-canary/ingress-canary-kzxfx"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811799 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7970ebfe-806a-44c4-9756-6d5bc4903eac-tmpfs\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811828 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd15b714-c5de-4649-b152-deaa5e374ec5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.811858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-default-certificate\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.812712 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.813290 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-plugins-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.814310 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d3c2015e-dd19-4448-ae65-bce4f36d35c5-images\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.814763 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3402525-92f4-4cf0-9fee-43faccdc51bd-service-ca-bundle\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.814855 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-srv-cert\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.814887 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0c1d0020-9917-42f4-b0af-587a46e23d3e-signing-key\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.814912 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7970ebfe-806a-44c4-9756-6d5bc4903eac-webhook-cert\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.814951 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff7jl\" (UniqueName: \"kubernetes.io/projected/77fb475f-5fd7-4d9e-a81c-9afe54902813-kube-api-access-ff7jl\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.814993 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3c2015e-dd19-4448-ae65-bce4f36d35c5-proxy-tls\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815051 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-node-bootstrap-token\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815075 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-config\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815103 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd15b714-c5de-4649-b152-deaa5e374ec5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815127 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mwf\" (UniqueName: \"kubernetes.io/projected/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-kube-api-access-r7mwf\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815148 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77fb475f-5fd7-4d9e-a81c-9afe54902813-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815171 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbvrj\" (UniqueName: \"kubernetes.io/projected/a3402525-92f4-4cf0-9fee-43faccdc51bd-kube-api-access-zbvrj\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-mountpoint-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815251 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-certs\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815276 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593dc5c4-4044-44c3-bc9c-93b61527da19-config\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815300 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rt7b\" (UniqueName: \"kubernetes.io/projected/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-kube-api-access-5rt7b\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815339 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815368 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0c1d0020-9917-42f4-b0af-587a46e23d3e-signing-cabundle\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815397 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv8m2\" (UniqueName: \"kubernetes.io/projected/7970ebfe-806a-44c4-9756-6d5bc4903eac-kube-api-access-tv8m2\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815430 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815456 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-profile-collector-cert\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815482 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjn44\" (UniqueName: \"kubernetes.io/projected/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-kube-api-access-gjn44\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815506 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmzpq\" (UniqueName: \"kubernetes.io/projected/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-kube-api-access-kmzpq\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815529 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3c2015e-dd19-4448-ae65-bce4f36d35c5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815553 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-serving-cert\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815574 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/593dc5c4-4044-44c3-bc9c-93b61527da19-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815615 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a7b248c-9d3e-4ed9-802e-e381f76846e4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hq2n4\" (UID: \"3a7b248c-9d3e-4ed9-802e-e381f76846e4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815642 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f4a804-22d1-4cdf-a403-d8e81dc3233e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815661 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd9c9717-dbe1-42fd-9648-e4a8cc4ec882-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-drg89\" (UID: \"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815684 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7djsn\" (UniqueName: \"kubernetes.io/projected/de36c82b-03fe-4750-95a9-75c7ee2e68bd-kube-api-access-7djsn\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815702 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-socket-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815729 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdn9k\" (UniqueName: \"kubernetes.io/projected/d3c2015e-dd19-4448-ae65-bce4f36d35c5-kube-api-access-fdn9k\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815750 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a11420c-890e-4314-ba16-1867be7c401c-config-volume\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815773 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8c5v\" (UniqueName: \"kubernetes.io/projected/64ee5f2d-78e3-4c7b-a1ea-1b38d888d043-kube-api-access-w8c5v\") pod \"dns-operator-744455d44c-924b8\" (UID: \"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043\") " pod="openshift-dns-operator/dns-operator-744455d44c-924b8"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815793 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815821 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jb6t\" (UniqueName: \"kubernetes.io/projected/9a11420c-890e-4314-ba16-1867be7c401c-kube-api-access-7jb6t\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815846 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-stats-auth\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815871 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7hg4\" (UniqueName: \"kubernetes.io/projected/389330df-47c0-4815-9070-2664655acaab-kube-api-access-d7hg4\") pod \"control-plane-machine-set-operator-78cbb6b69f-zfq7z\" (UID: \"389330df-47c0-4815-9070-2664655acaab\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815895 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815918 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpgcc\" (UniqueName: \"kubernetes.io/projected/b6b450dc-9948-4b88-b099-3d1aebf653d3-kube-api-access-qpgcc\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815948 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815980 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-csi-data-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816002 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816031 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/389330df-47c0-4815-9070-2664655acaab-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zfq7z\" (UID: \"389330df-47c0-4815-9070-2664655acaab\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816054 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b450dc-9948-4b88-b099-3d1aebf653d3-config-volume\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816078 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fb475f-5fd7-4d9e-a81c-9afe54902813-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp"
Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816099 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a11420c-890e-4314-ba16-1867be7c401c-metrics-tls\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816117 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ht5\" (UniqueName: \"kubernetes.io/projected/dd9c9717-dbe1-42fd-9648-e4a8cc4ec882-kube-api-access-z4ht5\") pod \"package-server-manager-789f6589d5-drg89\" (UID: \"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816141 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816161 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-trusted-ca\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816200 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wlqm\" (UniqueName: \"kubernetes.io/projected/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-kube-api-access-6wlqm\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc 
kubenswrapper[4585]: I0215 17:08:05.816225 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-metrics-certs\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816250 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-proxy-tls\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816283 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70f4a804-22d1-4cdf-a403-d8e81dc3233e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816307 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-config\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816327 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62ffg\" (UniqueName: \"kubernetes.io/projected/3a7b248c-9d3e-4ed9-802e-e381f76846e4-kube-api-access-62ffg\") pod 
\"multus-admission-controller-857f4d67dd-hq2n4\" (UID: \"3a7b248c-9d3e-4ed9-802e-e381f76846e4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816374 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhwxd\" (UniqueName: \"kubernetes.io/projected/a1985842-4918-4ebf-ac2a-1a08465d06df-kube-api-access-lhwxd\") pod \"migrator-59844c95c7-5m8q2\" (UID: \"a1985842-4918-4ebf-ac2a-1a08465d06df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816392 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7970ebfe-806a-44c4-9756-6d5bc4903eac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816411 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b450dc-9948-4b88-b099-3d1aebf653d3-secret-volume\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816444 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx4sp\" (UniqueName: \"kubernetes.io/projected/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-kube-api-access-tx4sp\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816444 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd15b714-c5de-4649-b152-deaa5e374ec5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816508 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-registration-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816561 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816620 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vj45\" (UniqueName: \"kubernetes.io/projected/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-kube-api-access-4vj45\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816648 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f4a804-22d1-4cdf-a403-d8e81dc3233e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816677 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44qb\" (UniqueName: \"kubernetes.io/projected/0c1d0020-9917-42f4-b0af-587a46e23d3e-kube-api-access-c44qb\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816700 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-srv-cert\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816723 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/593dc5c4-4044-44c3-bc9c-93b61527da19-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816756 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-metrics-tls\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.818488 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-mountpoint-dir\") 
pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.818659 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64ee5f2d-78e3-4c7b-a1ea-1b38d888d043-metrics-tls\") pod \"dns-operator-744455d44c-924b8\" (UID: \"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043\") " pod="openshift-dns-operator/dns-operator-744455d44c-924b8" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.816196 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-config\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.815793 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7970ebfe-806a-44c4-9756-6d5bc4903eac-tmpfs\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.819205 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3c2015e-dd19-4448-ae65-bce4f36d35c5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.819327 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/593dc5c4-4044-44c3-bc9c-93b61527da19-config\") pod 
\"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.820085 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b450dc-9948-4b88-b099-3d1aebf653d3-config-volume\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.820467 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.820712 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-trusted-ca\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.820901 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-csi-data-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.821037 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.321021784 +0000 UTC m=+142.264429906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.821296 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.821304 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0c1d0020-9917-42f4-b0af-587a46e23d3e-signing-cabundle\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.823079 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/389330df-47c0-4815-9070-2664655acaab-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zfq7z\" (UID: \"389330df-47c0-4815-9070-2664655acaab\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.823192 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.823710 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3c2015e-dd19-4448-ae65-bce4f36d35c5-proxy-tls\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.824007 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fb475f-5fd7-4d9e-a81c-9afe54902813-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.824046 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-socket-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.825217 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd15b714-c5de-4649-b152-deaa5e374ec5-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.825823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.825838 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a7b248c-9d3e-4ed9-802e-e381f76846e4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hq2n4\" (UID: \"3a7b248c-9d3e-4ed9-802e-e381f76846e4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.826238 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-srv-cert\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.826272 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7970ebfe-806a-44c4-9756-6d5bc4903eac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.826681 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/de36c82b-03fe-4750-95a9-75c7ee2e68bd-registration-dir\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.826699 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-metrics-tls\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.826985 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-metrics-certs\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.827172 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3402525-92f4-4cf0-9fee-43faccdc51bd-service-ca-bundle\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.827957 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-stats-auth\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.828528 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd9c9717-dbe1-42fd-9648-e4a8cc4ec882-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-drg89\" (UID: \"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.828678 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-serving-cert\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.828785 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0c1d0020-9917-42f4-b0af-587a46e23d3e-signing-key\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.828783 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-proxy-tls\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.829037 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b450dc-9948-4b88-b099-3d1aebf653d3-secret-volume\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 
17:08:05.829448 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-srv-cert\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.830304 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3402525-92f4-4cf0-9fee-43faccdc51bd-default-certificate\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.830793 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-profile-collector-cert\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.833126 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.878659 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp2v4\" (UniqueName: \"kubernetes.io/projected/f1907e91-47d3-4f5f-b701-bcd299d3b95b-kube-api-access-qp2v4\") pod \"oauth-openshift-558db77b4-ncfjb\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.879871 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.888579 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps9f4\" (UniqueName: \"kubernetes.io/projected/5b8c3acb-bb33-4cca-af37-4aad3f51f5a0-kube-api-access-ps9f4\") pod \"console-operator-58897d9998-24xpt\" (UID: \"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0\") " pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.889928 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77fb475f-5fd7-4d9e-a81c-9afe54902813-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.890194 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f4a804-22d1-4cdf-a403-d8e81dc3233e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.890486 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f4a804-22d1-4cdf-a403-d8e81dc3233e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.891403 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-config\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.891566 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.891199 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7970ebfe-806a-44c4-9756-6d5bc4903eac-webhook-cert\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.899634 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/593dc5c4-4044-44c3-bc9c-93b61527da19-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.907589 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85m28\" (UniqueName: \"kubernetes.io/projected/f8cc3912-1818-43de-94a2-00544a8fd90d-kube-api-access-85m28\") pod \"cluster-samples-operator-665b6dd947-sz7kd\" (UID: \"f8cc3912-1818-43de-94a2-00544a8fd90d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" Feb 15 17:08:05 
crc kubenswrapper[4585]: I0215 17:08:05.915777 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.918318 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.918473 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.418450743 +0000 UTC m=+142.361858895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.918866 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:05 crc kubenswrapper[4585]: E0215 17:08:05.919158 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.419146861 +0000 UTC m=+142.362555003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.931614 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.932905 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.939244 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a11420c-890e-4314-ba16-1867be7c401c-config-volume\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.953951 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 15 17:08:05 crc kubenswrapper[4585]: I0215 17:08:05.966811 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a11420c-890e-4314-ba16-1867be7c401c-metrics-tls\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:05 crc 
kubenswrapper[4585]: I0215 17:08:05.974562 4585 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.019907 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.020442 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.520428309 +0000 UTC m=+142.463836431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.020592 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.022520 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.033182 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 15 17:08:06 crc 
kubenswrapper[4585]: I0215 17:08:06.044513 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-certs\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.055530 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.059158 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.073768 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.081467 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-node-bootstrap-token\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.097374 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.112487 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.122550 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ncfjb"] Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.124849 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.125255 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.625242496 +0000 UTC m=+142.568650638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.135700 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 15 17:08:06 crc kubenswrapper[4585]: W0215 17:08:06.142784 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1907e91_47d3_4f5f_b701_bcd299d3b95b.slice/crio-2c7c6ae4eebd0cec2675bd64fadc66efcf12e5742fa0504e760f0e320abed512 WatchSource:0}: Error finding container 2c7c6ae4eebd0cec2675bd64fadc66efcf12e5742fa0504e760f0e320abed512: Status 404 returned error can't find the container with id 2c7c6ae4eebd0cec2675bd64fadc66efcf12e5742fa0504e760f0e320abed512 Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.148774 4585 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd"] Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.153991 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.158128 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73923b56-3ae8-426f-acac-cf6aedb40970-cert\") pod \"ingress-canary-kzxfx\" (UID: \"73923b56-3ae8-426f-acac-cf6aedb40970\") " pod="openshift-ingress-canary/ingress-canary-kzxfx" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.192813 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.202253 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f55299e6-4571-42b7-b96f-35d8612609d2-serving-cert\") pod \"apiserver-7bbb656c7d-fg58h\" (UID: \"f55299e6-4571-42b7-b96f-35d8612609d2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.232329 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.236301 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:06.736278111 +0000 UTC m=+142.679686243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.241137 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl8nd\" (UniqueName: \"kubernetes.io/projected/9f407773-408b-4d66-a516-59646429f2fb-kube-api-access-zl8nd\") pod \"apiserver-76f77b778f-gjvln\" (UID: \"9f407773-408b-4d66-a516-59646429f2fb\") " pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.247030 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8hm7\" (UniqueName: \"kubernetes.io/projected/a7dbb573-46b2-46e7-8b4c-ca1737c36335-kube-api-access-k8hm7\") pod \"etcd-operator-b45778765-xptvq\" (UID: \"a7dbb573-46b2-46e7-8b4c-ca1737c36335\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.266830 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf6f5\" (UniqueName: \"kubernetes.io/projected/99c50873-2a59-455e-a743-b86618388940-kube-api-access-qf6f5\") pod \"authentication-operator-69f744f599-7b5rd\" (UID: \"99c50873-2a59-455e-a743-b86618388940\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.286694 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85246\" (UniqueName: 
\"kubernetes.io/projected/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-kube-api-access-85246\") pod \"console-f9d7485db-gfx64\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.311223 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v262\" (UniqueName: \"kubernetes.io/projected/5419d7c7-80b9-446a-bb53-cfdc7f90a964-kube-api-access-9v262\") pod \"machine-approver-56656f9798-fjk9c\" (UID: \"5419d7c7-80b9-446a-bb53-cfdc7f90a964\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.318338 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-24xpt"] Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.328357 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7gjd\" (UniqueName: \"kubernetes.io/projected/146b62e3-ce60-47c0-aec8-8811a20bddfe-kube-api-access-l7gjd\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:06 crc kubenswrapper[4585]: W0215 17:08:06.333166 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b8c3acb_bb33_4cca_af37_4aad3f51f5a0.slice/crio-40201c476b176012d742b1eac33db4f5f01c882167dcbf52e4b67b7c1bd2babb WatchSource:0}: Error finding container 40201c476b176012d742b1eac33db4f5f01c882167dcbf52e4b67b7c1bd2babb: Status 404 returned error can't find the container with id 40201c476b176012d742b1eac33db4f5f01c882167dcbf52e4b67b7c1bd2babb Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.337463 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.337791 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.837780435 +0000 UTC m=+142.781188567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.346136 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.347276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p84r\" (UniqueName: \"kubernetes.io/projected/d9c426cb-d8ae-4150-adb2-327d42b7df5b-kube-api-access-7p84r\") pod \"route-controller-manager-6576b87f9c-6jtm8\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.366640 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146b62e3-ce60-47c0-aec8-8811a20bddfe-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bdstn\" (UID: \"146b62e3-ce60-47c0-aec8-8811a20bddfe\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.380811 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.388138 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-bound-sa-token\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.403334 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.411060 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzv4\" (UniqueName: \"kubernetes.io/projected/6a1b22c6-f790-4713-963f-2a4f2141ac57-kube-api-access-ngzv4\") pod \"openshift-apiserver-operator-796bbdcf4f-jf2jp\" (UID: \"6a1b22c6-f790-4713-963f-2a4f2141ac57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.429789 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smt46\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-kube-api-access-smt46\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.444166 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.444381 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.444851 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:06.944825619 +0000 UTC m=+142.888233971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.447320 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.448170 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88vw\" (UniqueName: \"kubernetes.io/projected/a16374fa-419c-416f-86ab-0f18c37da52c-kube-api-access-n88vw\") pod \"openshift-config-operator-7777fb866f-qdjvg\" (UID: \"a16374fa-419c-416f-86ab-0f18c37da52c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.474020 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4tgk\" (UniqueName: \"kubernetes.io/projected/149cbe83-6c8e-471a-8534-73fa827b39a6-kube-api-access-t4tgk\") pod \"downloads-7954f5f757-cfzm8\" (UID: \"149cbe83-6c8e-471a-8534-73fa827b39a6\") " pod="openshift-console/downloads-7954f5f757-cfzm8" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.486458 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cfzm8" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.489899 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.494254 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-882nf\" (UniqueName: \"kubernetes.io/projected/bd15b714-c5de-4649-b152-deaa5e374ec5-kube-api-access-882nf\") pod \"kube-storage-version-migrator-operator-b67b599dd-h2gbs\" (UID: \"bd15b714-c5de-4649-b152-deaa5e374ec5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.510946 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.516249 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff7jl\" (UniqueName: \"kubernetes.io/projected/77fb475f-5fd7-4d9e-a81c-9afe54902813-kube-api-access-ff7jl\") pod \"openshift-controller-manager-operator-756b6f6bc6-ndfkp\" (UID: \"77fb475f-5fd7-4d9e-a81c-9afe54902813\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.528126 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.531029 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbvrj\" (UniqueName: \"kubernetes.io/projected/a3402525-92f4-4cf0-9fee-43faccdc51bd-kube-api-access-zbvrj\") pod \"router-default-5444994796-nl8js\" (UID: \"a3402525-92f4-4cf0-9fee-43faccdc51bd\") " pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.546256 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.548701 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.048687792 +0000 UTC m=+142.992095924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.549052 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.569097 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.570162 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.571459 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mwf\" (UniqueName: \"kubernetes.io/projected/815bdab7-6a3e-4b19-8f83-e6cc5c83461a-kube-api-access-r7mwf\") pod \"olm-operator-6b444d44fb-mgxft\" (UID: \"815bdab7-6a3e-4b19-8f83-e6cc5c83461a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.574628 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng257\" (UniqueName: \"kubernetes.io/projected/73923b56-3ae8-426f-acac-cf6aedb40970-kube-api-access-ng257\") pod \"ingress-canary-kzxfx\" (UID: \"73923b56-3ae8-426f-acac-cf6aedb40970\") " pod="openshift-ingress-canary/ingress-canary-kzxfx" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.617672 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7hg4\" (UniqueName: \"kubernetes.io/projected/389330df-47c0-4815-9070-2664655acaab-kube-api-access-d7hg4\") pod \"control-plane-machine-set-operator-78cbb6b69f-zfq7z\" (UID: \"389330df-47c0-4815-9070-2664655acaab\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.632487 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx4sp\" (UniqueName: 
\"kubernetes.io/projected/1aad31ad-a5aa-45d1-8eae-75a95aa1407a-kube-api-access-tx4sp\") pod \"service-ca-operator-777779d784-vqkgn\" (UID: \"1aad31ad-a5aa-45d1-8eae-75a95aa1407a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.640393 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kzxfx" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.643899 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv8m2\" (UniqueName: \"kubernetes.io/projected/7970ebfe-806a-44c4-9756-6d5bc4903eac-kube-api-access-tv8m2\") pod \"packageserver-d55dfcdfc-jp6zp\" (UID: \"7970ebfe-806a-44c4-9756-6d5bc4903eac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.648205 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.648372 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.148328158 +0000 UTC m=+143.091736300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.649311 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.649914 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.149903257 +0000 UTC m=+143.093311389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.656502 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmzpq\" (UniqueName: \"kubernetes.io/projected/3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6-kube-api-access-kmzpq\") pod \"machine-config-server-s2xnb\" (UID: \"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6\") " pod="openshift-machine-config-operator/machine-config-server-s2xnb" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.685307 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjn44\" (UniqueName: \"kubernetes.io/projected/b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5-kube-api-access-gjn44\") pod \"machine-config-controller-84d6567774-w62vm\" (UID: \"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.694603 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wlqm\" (UniqueName: \"kubernetes.io/projected/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-kube-api-access-6wlqm\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.695811 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" 
event={"ID":"f8cc3912-1818-43de-94a2-00544a8fd90d","Type":"ContainerStarted","Data":"a39deffe1d4109fc21570224a4a6c6cba174b321410f37fe05e09b31b26cff2d"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.695866 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" event={"ID":"f8cc3912-1818-43de-94a2-00544a8fd90d","Type":"ContainerStarted","Data":"136b5667ae8fe5ab92bdcc134eca94df86838fb84dc7b65650e4d7e2dcca66e9"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.695877 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" event={"ID":"f8cc3912-1818-43de-94a2-00544a8fd90d","Type":"ContainerStarted","Data":"06dde21a1fed7dd5d488b7fcb37e37057f4be493bc98cf3d46fd5df5f1067a62"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.700913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" event={"ID":"f1907e91-47d3-4f5f-b701-bcd299d3b95b","Type":"ContainerStarted","Data":"f760f5a2efa454041b44bed7a9fdcbb2f99f6a2ab573bb209805e0ad158d6adb"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.700955 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" event={"ID":"f1907e91-47d3-4f5f-b701-bcd299d3b95b","Type":"ContainerStarted","Data":"2c7c6ae4eebd0cec2675bd64fadc66efcf12e5742fa0504e760f0e320abed512"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.701735 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.712440 4585 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ncfjb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": 
dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.712501 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" podUID="f1907e91-47d3-4f5f-b701-bcd299d3b95b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.720292 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vj45\" (UniqueName: \"kubernetes.io/projected/7c5d13a2-181b-4067-a4f6-dc796ee3e6ca-kube-api-access-4vj45\") pod \"catalog-operator-68c6474976-w5kxz\" (UID: \"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.726575 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" event={"ID":"d76d195a-b0da-4f95-9bc3-a7d9510e749a","Type":"ContainerStarted","Data":"56e0e08efe807e3cee420e98f26816b0e0fe578cb02fc336567c264ab95d3ed9"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.726635 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" event={"ID":"d76d195a-b0da-4f95-9bc3-a7d9510e749a","Type":"ContainerStarted","Data":"3e8de1651176763794228799824c8854c068eb4a53e8b519974a0e7e1cf9041e"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.726645 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" event={"ID":"d76d195a-b0da-4f95-9bc3-a7d9510e749a","Type":"ContainerStarted","Data":"12c164e225cc9b6d95abbae7d73d59f324d7e870880ae4f1f627f0035f08c7eb"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.730963 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console-operator/console-operator-58897d9998-24xpt" event={"ID":"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0","Type":"ContainerStarted","Data":"1c237cd008db82d6aedea7b4fd8b1406720ff66ff498a77c7c4c3e991b5a6fdb"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.731196 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.731206 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-24xpt" event={"ID":"5b8c3acb-bb33-4cca-af37-4aad3f51f5a0","Type":"ContainerStarted","Data":"40201c476b176012d742b1eac33db4f5f01c882167dcbf52e4b67b7c1bd2babb"} Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.731260 4585 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-69rgc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.731283 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" podUID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.734331 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.737191 4585 patch_prober.go:28] interesting pod/console-operator-58897d9998-24xpt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.737270 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-24xpt" podUID="5b8c3acb-bb33-4cca-af37-4aad3f51f5a0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.737851 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24e3abe1-a96b-4a1c-8fd7-fa28da78822f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5td8z\" (UID: \"24e3abe1-a96b-4a1c-8fd7-fa28da78822f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.746771 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.751721 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.751800 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jb6t\" (UniqueName: \"kubernetes.io/projected/9a11420c-890e-4314-ba16-1867be7c401c-kube-api-access-7jb6t\") pod \"dns-default-jt5m5\" (UID: \"9a11420c-890e-4314-ba16-1867be7c401c\") " pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.752027 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.252015706 +0000 UTC m=+143.195423838 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.752887 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.755933 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.255918135 +0000 UTC m=+143.199326267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.770548 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpgcc\" (UniqueName: \"kubernetes.io/projected/b6b450dc-9948-4b88-b099-3d1aebf653d3-kube-api-access-qpgcc\") pod \"collect-profiles-29519580-zkkws\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:06 crc kubenswrapper[4585]: W0215 17:08:06.812547 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3402525_92f4_4cf0_9fee_43faccdc51bd.slice/crio-a31d3a20f8c439078548807e1f5afc1782b0aa588bb2ec4b6d327bfc5cd3a8a1 WatchSource:0}: Error finding container a31d3a20f8c439078548807e1f5afc1782b0aa588bb2ec4b6d327bfc5cd3a8a1: Status 404 returned error can't find the container with id a31d3a20f8c439078548807e1f5afc1782b0aa588bb2ec4b6d327bfc5cd3a8a1 Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.816319 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rt7b\" (UniqueName: \"kubernetes.io/projected/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-kube-api-access-5rt7b\") pod \"marketplace-operator-79b997595-z5d9n\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.817007 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn"] Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.818430 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.826073 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.826982 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ht5\" (UniqueName: \"kubernetes.io/projected/dd9c9717-dbe1-42fd-9648-e4a8cc4ec882-kube-api-access-z4ht5\") pod \"package-server-manager-789f6589d5-drg89\" (UID: \"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.828849 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62ffg\" (UniqueName: \"kubernetes.io/projected/3a7b248c-9d3e-4ed9-802e-e381f76846e4-kube-api-access-62ffg\") pod \"multus-admission-controller-857f4d67dd-hq2n4\" (UID: \"3a7b248c-9d3e-4ed9-802e-e381f76846e4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.831477 4585 request.go:700] Waited for 1.010338141s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/serviceaccounts/kube-storage-version-migrator-sa/token Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.832848 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.839556 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.846156 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhwxd\" (UniqueName: \"kubernetes.io/projected/a1985842-4918-4ebf-ac2a-1a08465d06df-kube-api-access-lhwxd\") pod \"migrator-59844c95c7-5m8q2\" (UID: \"a1985842-4918-4ebf-ac2a-1a08465d06df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.853103 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.853258 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.857454 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.857826 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.357811948 +0000 UTC m=+143.301220080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.860907 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.873209 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.874962 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70f4a804-22d1-4cdf-a403-d8e81dc3233e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vg82t\" (UID: \"70f4a804-22d1-4cdf-a403-d8e81dc3233e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.875738 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.884351 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.892526 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44qb\" (UniqueName: \"kubernetes.io/projected/0c1d0020-9917-42f4-b0af-587a46e23d3e-kube-api-access-c44qb\") pod \"service-ca-9c57cc56f-xkhhs\" (UID: \"0c1d0020-9917-42f4-b0af-587a46e23d3e\") " pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.892768 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.903170 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.908164 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.916674 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/593dc5c4-4044-44c3-bc9c-93b61527da19-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k9g4d\" (UID: \"593dc5c4-4044-44c3-bc9c-93b61527da19\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.919430 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-gfx64"] Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.939243 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-s2xnb" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.946824 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h"] Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.948563 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xptvq"] Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.950229 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdn9k\" (UniqueName: \"kubernetes.io/projected/d3c2015e-dd19-4448-ae65-bce4f36d35c5-kube-api-access-fdn9k\") pod \"machine-config-operator-74547568cd-wtss4\" (UID: \"d3c2015e-dd19-4448-ae65-bce4f36d35c5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.958468 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:06 crc kubenswrapper[4585]: E0215 17:08:06.958971 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.458960242 +0000 UTC m=+143.402368374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.967712 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5a28822-618f-48f8-bc6d-9f4aa2be4a9f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kcvnt\" (UID: \"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:06 crc kubenswrapper[4585]: I0215 17:08:06.969218 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7djsn\" (UniqueName: \"kubernetes.io/projected/de36c82b-03fe-4750-95a9-75c7ee2e68bd-kube-api-access-7djsn\") pod \"csi-hostpathplugin-6vf5k\" (UID: \"de36c82b-03fe-4750-95a9-75c7ee2e68bd\") " pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:06.994691 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8c5v\" (UniqueName: \"kubernetes.io/projected/64ee5f2d-78e3-4c7b-a1ea-1b38d888d043-kube-api-access-w8c5v\") pod \"dns-operator-744455d44c-924b8\" (UID: \"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043\") " pod="openshift-dns-operator/dns-operator-744455d44c-924b8" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.010275 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.021834 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-924b8" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.059043 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.059444 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.559418958 +0000 UTC m=+143.502827090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.059618 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.059940 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.559930561 +0000 UTC m=+143.503338693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.078453 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.092373 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.099311 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.106535 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.164523 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.166336 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.666308468 +0000 UTC m=+143.609716590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.226763 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.274093 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.274441 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.77442885 +0000 UTC m=+143.717836982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.360665 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs"] Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.361093 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gjvln"] Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.361724 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp"] Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.376107 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.376755 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.876731824 +0000 UTC m=+143.820139956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.377319 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.378024 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.878014566 +0000 UTC m=+143.821422688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.473389 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cfzm8"] Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.481179 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.481495 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:07.981479329 +0000 UTC m=+143.924887461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.544272 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp"] Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.550301 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kzxfx"] Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.550564 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7b5rd"] Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.584185 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.584696 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.084681506 +0000 UTC m=+144.028089638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.686521 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.687117 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.187104681 +0000 UTC m=+144.130512813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.748122 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg"] Feb 15 17:08:07 crc kubenswrapper[4585]: W0215 17:08:07.767284 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a1b22c6_f790_4713_963f_2a4f2141ac57.slice/crio-01c12a22d89e4cee6d358e1fbe145f51546e03943bdb7b4f769bf82a07b7b6c2 WatchSource:0}: Error finding container 01c12a22d89e4cee6d358e1fbe145f51546e03943bdb7b4f769bf82a07b7b6c2: Status 404 returned error can't find the container with id 01c12a22d89e4cee6d358e1fbe145f51546e03943bdb7b4f769bf82a07b7b6c2 Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.772844 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" event={"ID":"146b62e3-ce60-47c0-aec8-8811a20bddfe","Type":"ContainerStarted","Data":"f18d09df5337aa5da1b2aadf0d4d629b8ab83d130b3120365ce7d0e5b1e7ebaa"} Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.789465 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:07 
crc kubenswrapper[4585]: E0215 17:08:07.790099 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.290084112 +0000 UTC m=+144.233492244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.818293 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"] Feb 15 17:08:07 crc kubenswrapper[4585]: W0215 17:08:07.862731 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c50873_2a59_455e_a743_b86618388940.slice/crio-519d2aea117d435c2c5d2ea4417a450366f4ee1e05837900a3c9ab4b4adb42ae WatchSource:0}: Error finding container 519d2aea117d435c2c5d2ea4417a450366f4ee1e05837900a3c9ab4b4adb42ae: Status 404 returned error can't find the container with id 519d2aea117d435c2c5d2ea4417a450366f4ee1e05837900a3c9ab4b4adb42ae Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.880133 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" event={"ID":"5419d7c7-80b9-446a-bb53-cfdc7f90a964","Type":"ContainerStarted","Data":"35996ff66e5297e7acc51a75234d9f7bf8dc236c8d7faf584cf014bb2ce13b4a"} Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.880319 4585 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" event={"ID":"5419d7c7-80b9-446a-bb53-cfdc7f90a964","Type":"ContainerStarted","Data":"7fd67695c7f5831670877f7181059d30fce62dea52e0efa228b10edca3953143"} Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.892379 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.893065 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.393038142 +0000 UTC m=+144.336446274 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:07 crc kubenswrapper[4585]: W0215 17:08:07.898732 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73923b56_3ae8_426f_acac_cf6aedb40970.slice/crio-d655d1210a0b5ed1932256bf82b76bb18fc4c86a7c07c1a6b255d3bf87e6225a WatchSource:0}: Error finding container d655d1210a0b5ed1932256bf82b76bb18fc4c86a7c07c1a6b255d3bf87e6225a: Status 404 returned error can't find the container with id d655d1210a0b5ed1932256bf82b76bb18fc4c86a7c07c1a6b255d3bf87e6225a Feb 15 17:08:07 crc kubenswrapper[4585]: I0215 17:08:07.995945 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:07 crc kubenswrapper[4585]: E0215 17:08:07.996799 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.496785023 +0000 UTC m=+144.440193155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.008152 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.023800 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.031234 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qdp6s" podStartSLOduration=123.031204458 podStartE2EDuration="2m3.031204458s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:08.020521376 +0000 UTC m=+143.963929508" watchObservedRunningTime="2026-02-15 17:08:08.031204458 +0000 UTC m=+143.974612580" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.069820 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.076278 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" event={"ID":"a7dbb573-46b2-46e7-8b4c-ca1737c36335","Type":"ContainerStarted","Data":"7a61dea4d6f620809654e496b009298b57a261bd6ce59cd10e91916affdfdc23"} Feb 15 17:08:08 crc 
kubenswrapper[4585]: I0215 17:08:08.100680 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.100897 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.600866241 +0000 UTC m=+144.544274363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.101248 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.102102 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:08.602059721 +0000 UTC m=+144.545467853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.121769 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" event={"ID":"9f407773-408b-4d66-a516-59646429f2fb","Type":"ContainerStarted","Data":"90ccb2217240b071747570c92214a447a90f85298e0784053e1117a898601446"} Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.152223 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-24xpt" podStartSLOduration=124.152207957 podStartE2EDuration="2m4.152207957s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:08.151849608 +0000 UTC m=+144.095257740" watchObservedRunningTime="2026-02-15 17:08:08.152207957 +0000 UTC m=+144.095616089" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.183123 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" event={"ID":"f55299e6-4571-42b7-b96f-35d8612609d2","Type":"ContainerStarted","Data":"94170c196e8d60766c106cbba9196b183d16f96c4bc87df6feef279e08d51de9"} Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.209540 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.209874 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.709857695 +0000 UTC m=+144.653265827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.242319 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" event={"ID":"bd15b714-c5de-4649-b152-deaa5e374ec5","Type":"ContainerStarted","Data":"26a6d2e9771797e3dc79533173a4d9e0e08908c6e0afc3a936a639a209a9a9ac"} Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.264872 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" podStartSLOduration=124.264858515 podStartE2EDuration="2m4.264858515s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:08.264074524 +0000 UTC m=+144.207482656" 
watchObservedRunningTime="2026-02-15 17:08:08.264858515 +0000 UTC m=+144.208266647" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.277871 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nl8js" event={"ID":"a3402525-92f4-4cf0-9fee-43faccdc51bd","Type":"ContainerStarted","Data":"a31d3a20f8c439078548807e1f5afc1782b0aa588bb2ec4b6d327bfc5cd3a8a1"} Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.316097 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.316350 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.816341265 +0000 UTC m=+144.759749397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.317059 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hq2n4"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.327161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfx64" event={"ID":"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c","Type":"ContainerStarted","Data":"0e992700581df520d0d114aaaf6cd92d0f8386a053cd4dc5a8f000c1a2c61642"} Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.345963 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.417717 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.418819 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:08.918805862 +0000 UTC m=+144.862213994 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: W0215 17:08:08.451725 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5d13a2_181b_4067_a4f6_dc796ee3e6ca.slice/crio-348d28d40a36f5e408dbf1a650531dc22d8db6b4282e7b02d4404d42a3258439 WatchSource:0}: Error finding container 348d28d40a36f5e408dbf1a650531dc22d8db6b4282e7b02d4404d42a3258439: Status 404 returned error can't find the container with id 348d28d40a36f5e408dbf1a650531dc22d8db6b4282e7b02d4404d42a3258439 Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.471354 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" podStartSLOduration=123.471335939 podStartE2EDuration="2m3.471335939s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:08.452012887 +0000 UTC m=+144.395421019" watchObservedRunningTime="2026-02-15 17:08:08.471335939 +0000 UTC m=+144.414744071" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.472002 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.520030 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.523971 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.023960438 +0000 UTC m=+144.967368570 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.572684 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xkhhs"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.601000 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.622869 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.626576 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.626897 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.126883327 +0000 UTC m=+145.070291459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.687832 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.718285 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sz7kd" podStartSLOduration=124.718270152 podStartE2EDuration="2m4.718270152s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:08.691513672 +0000 UTC m=+144.634921814" watchObservedRunningTime="2026-02-15 17:08:08.718270152 +0000 UTC m=+144.661678284" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.718703 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z5d9n"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.733923 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.734259 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.234249219 +0000 UTC m=+145.177657341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.749849 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.759848 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:08 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:08 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:08 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.760148 4585 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.834683 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.835082 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.335068405 +0000 UTC m=+145.278476537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.835105 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"] Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.844730 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp"] Feb 15 17:08:08 crc kubenswrapper[4585]: W0215 17:08:08.859551 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda1e602_f2c2_44f9_8b27_2b4e25eab6d1.slice/crio-7270427d4f263e119dc67dc314cf7d59e02132f3e662cb2e577c45d6d77c57d7 WatchSource:0}: Error finding container 7270427d4f263e119dc67dc314cf7d59e02132f3e662cb2e577c45d6d77c57d7: Status 404 returned error can't find the container with id 7270427d4f263e119dc67dc314cf7d59e02132f3e662cb2e577c45d6d77c57d7 Feb 15 17:08:08 crc kubenswrapper[4585]: I0215 17:08:08.937322 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:08 crc kubenswrapper[4585]: E0215 17:08:08.937756 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.437743448 +0000 UTC m=+145.381151580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:08 crc kubenswrapper[4585]: W0215 17:08:08.969395 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389330df_47c0_4815_9070_2664655acaab.slice/crio-1b16d369dad7fa92861ca5efaca67d9681e1c6c2b08279e6627bd7a93df423b7 WatchSource:0}: Error finding container 1b16d369dad7fa92861ca5efaca67d9681e1c6c2b08279e6627bd7a93df423b7: Status 404 returned error can't find the container with id 1b16d369dad7fa92861ca5efaca67d9681e1c6c2b08279e6627bd7a93df423b7 Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.007539 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.037950 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.038242 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.538229765 +0000 UTC m=+145.481637897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.047498 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.131157 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nl8js" podStartSLOduration=124.13114047 podStartE2EDuration="2m4.13114047s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.130296308 +0000 UTC m=+145.073704440" watchObservedRunningTime="2026-02-15 17:08:09.13114047 +0000 UTC m=+145.074548602" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.146321 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 
17:08:09.146793 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.646780527 +0000 UTC m=+145.590188669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.157668 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.158337 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jt5m5"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.207752 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-24xpt" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.221137 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-924b8"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.251842 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.252124 4585 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.752091077 +0000 UTC m=+145.695499209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.253184 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.253735 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.753723199 +0000 UTC m=+145.697131331 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.271056 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.294152 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt"] Feb 15 17:08:09 crc kubenswrapper[4585]: W0215 17:08:09.323068 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c2015e_dd19_4448_ae65_bce4f36d35c5.slice/crio-4bd341064f868c4dfc5ba56bb58ea51db716a22bca14677caaf19796fcfa5177 WatchSource:0}: Error finding container 4bd341064f868c4dfc5ba56bb58ea51db716a22bca14677caaf19796fcfa5177: Status 404 returned error can't find the container with id 4bd341064f868c4dfc5ba56bb58ea51db716a22bca14677caaf19796fcfa5177 Feb 15 17:08:09 crc kubenswrapper[4585]: W0215 17:08:09.327900 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ee5f2d_78e3_4c7b_a1ea_1b38d888d043.slice/crio-e1111194a2b5b1bfc8d58107ebfce85f8b30eacb59b5b8cb3b37c2e0975b5814 WatchSource:0}: Error finding container e1111194a2b5b1bfc8d58107ebfce85f8b30eacb59b5b8cb3b37c2e0975b5814: Status 404 returned error can't find the container with id e1111194a2b5b1bfc8d58107ebfce85f8b30eacb59b5b8cb3b37c2e0975b5814 Feb 15 17:08:09 crc 
kubenswrapper[4585]: I0215 17:08:09.334341 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" event={"ID":"7970ebfe-806a-44c4-9756-6d5bc4903eac","Type":"ContainerStarted","Data":"4b6c7f7d46a2f80b5a221d0b071862460e200e1fcfa4e53354d99f3891a1734a"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.351439 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" event={"ID":"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca","Type":"ContainerStarted","Data":"348d28d40a36f5e408dbf1a650531dc22d8db6b4282e7b02d4404d42a3258439"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.353707 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.353903 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.853884497 +0000 UTC m=+145.797292639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.354069 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.354306 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.854298718 +0000 UTC m=+145.797706840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.359066 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" event={"ID":"146b62e3-ce60-47c0-aec8-8811a20bddfe","Type":"ContainerStarted","Data":"3dfc015fa47723499ef0d927647c693eee0085eca9881c133b57e59ccc786888"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.365420 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" event={"ID":"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882","Type":"ContainerStarted","Data":"3514778fcb201f11f2e6d8354091114f432663c65f7ae8557d17b7cd856a0c50"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.373579 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" event={"ID":"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5","Type":"ContainerStarted","Data":"5dbf2d5e79452cda152213e1bae547851b40220bfa266a166b41572bc3d69cc5"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.380349 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" event={"ID":"6a1b22c6-f790-4713-963f-2a4f2141ac57","Type":"ContainerStarted","Data":"01c12a22d89e4cee6d358e1fbe145f51546e03943bdb7b4f769bf82a07b7b6c2"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.390066 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" event={"ID":"389330df-47c0-4815-9070-2664655acaab","Type":"ContainerStarted","Data":"1b16d369dad7fa92861ca5efaca67d9681e1c6c2b08279e6627bd7a93df423b7"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.391283 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jt5m5" event={"ID":"9a11420c-890e-4314-ba16-1867be7c401c","Type":"ContainerStarted","Data":"ebebc6ec6f40d9a448cccf2ef3f5f6b8c28b2fcb324da0122362a1191d032494"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.392300 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" event={"ID":"77fb475f-5fd7-4d9e-a81c-9afe54902813","Type":"ContainerStarted","Data":"bf76de876eeb8e15a043cd648c499ffcf0ad6f5df530f1d9cd9edf3ad1129224"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.396808 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nl8js" event={"ID":"a3402525-92f4-4cf0-9fee-43faccdc51bd","Type":"ContainerStarted","Data":"1a72ecbb4efb816b82b5eda459ef9437ee86e8825ff9537e37481b0c6ab42cc5"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.404295 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bdstn" podStartSLOduration=124.40428088 podStartE2EDuration="2m4.40428088s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.403978892 +0000 UTC m=+145.347387024" watchObservedRunningTime="2026-02-15 17:08:09.40428088 +0000 UTC m=+145.347689012" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.413393 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-6vf5k"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.413564 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" event={"ID":"d9c426cb-d8ae-4150-adb2-327d42b7df5b","Type":"ContainerStarted","Data":"5237edb6f5dddf5c4c6340bd30e5f0304b9991ce15682feb9e75020bb611a4a5"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.428809 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" event={"ID":"815bdab7-6a3e-4b19-8f83-e6cc5c83461a","Type":"ContainerStarted","Data":"f6b19e4567c95726810153d2f5d34015a10dee6b52992ffbe5d894c815084da2"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.438646 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" event={"ID":"a7dbb573-46b2-46e7-8b4c-ca1737c36335","Type":"ContainerStarted","Data":"77257febfcbefda7e222958998fbf484c1ac94e5b8c38c5ff9694f89b3b67e88"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.455541 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.457346 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:09.95733145 +0000 UTC m=+145.900739582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.458230 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kzxfx" event={"ID":"73923b56-3ae8-426f-acac-cf6aedb40970","Type":"ContainerStarted","Data":"d655d1210a0b5ed1932256bf82b76bb18fc4c86a7c07c1a6b255d3bf87e6225a"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.477124 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xptvq" podStartSLOduration=124.477106743 podStartE2EDuration="2m4.477106743s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.465812856 +0000 UTC m=+145.409220988" watchObservedRunningTime="2026-02-15 17:08:09.477106743 +0000 UTC m=+145.420514875" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.482165 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d"] Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.494138 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" event={"ID":"99c50873-2a59-455e-a743-b86618388940","Type":"ContainerStarted","Data":"621ee5a99b8c1e9a9f42bfb032380699a76cc502be9d02ad04020d636000e16f"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.494175 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" event={"ID":"99c50873-2a59-455e-a743-b86618388940","Type":"ContainerStarted","Data":"519d2aea117d435c2c5d2ea4417a450366f4ee1e05837900a3c9ab4b4adb42ae"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.496370 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" event={"ID":"1aad31ad-a5aa-45d1-8eae-75a95aa1407a","Type":"ContainerStarted","Data":"15a9396e22bce03fefa22b65565e0598c398aaa0761572c714e7fa8a4c016f74"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.497454 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" event={"ID":"0c1d0020-9917-42f4-b0af-587a46e23d3e","Type":"ContainerStarted","Data":"0a8f3983df8e287c5d1c5a3b0f4994a5a118edac2b3ca2d12f0d3784d7b02cd7"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.498402 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" event={"ID":"a16374fa-419c-416f-86ab-0f18c37da52c","Type":"ContainerStarted","Data":"109158eb2370bca0fba0aae794b7df5c4a659ce73c85d99d74c68affcdcd191c"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.499408 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" event={"ID":"a1985842-4918-4ebf-ac2a-1a08465d06df","Type":"ContainerStarted","Data":"aa43cc66c8e8a106323cc82194e1a648a1b8bf302e02fd9c6f254063553f3298"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.500253 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" event={"ID":"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1","Type":"ContainerStarted","Data":"7270427d4f263e119dc67dc314cf7d59e02132f3e662cb2e577c45d6d77c57d7"} Feb 15 17:08:09 crc kubenswrapper[4585]: 
I0215 17:08:09.520505 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" event={"ID":"24e3abe1-a96b-4a1c-8fd7-fa28da78822f","Type":"ContainerStarted","Data":"a118b09bb4bdbbc453ce65b9ddb9b3fa5f321a856f77becaca58fbd05f108d2a"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.534259 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7b5rd" podStartSLOduration=125.534240638 podStartE2EDuration="2m5.534240638s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.53236695 +0000 UTC m=+145.475775082" watchObservedRunningTime="2026-02-15 17:08:09.534240638 +0000 UTC m=+145.477648770" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.551857 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" event={"ID":"bd15b714-c5de-4649-b152-deaa5e374ec5","Type":"ContainerStarted","Data":"7269d92a091afb0292c386206e4aefdeb32b1ae399703b6ee8b1fc165c8cef8c"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.562333 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.564410 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:10.064399544 +0000 UTC m=+146.007807666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.579852 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfx64" event={"ID":"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c","Type":"ContainerStarted","Data":"e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.586813 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" event={"ID":"b6b450dc-9948-4b88-b099-3d1aebf653d3","Type":"ContainerStarted","Data":"8b941bd3cb337caa0fb12b4a29a07f78f10cfd0cbc40a7130b0812a4e94946d5"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.607810 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cfzm8" event={"ID":"149cbe83-6c8e-471a-8534-73fa827b39a6","Type":"ContainerStarted","Data":"4edf1aa55c789b3ad24ab8be6eb03cc13bca0b4c52ac592310519360447adce1"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.607857 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cfzm8" event={"ID":"149cbe83-6c8e-471a-8534-73fa827b39a6","Type":"ContainerStarted","Data":"9fe18d530529b62fe26e84da34f78ceb1db84064c94a1dbb725e16c79d9b8456"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.608724 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console/downloads-7954f5f757-cfzm8" Feb 15 17:08:09 crc kubenswrapper[4585]: W0215 17:08:09.641404 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod593dc5c4_4044_44c3_bc9c_93b61527da19.slice/crio-3701ec00f63a2d931113c09a03403f14c88afacb49cbef43a39e269ee562a90d WatchSource:0}: Error finding container 3701ec00f63a2d931113c09a03403f14c88afacb49cbef43a39e269ee562a90d: Status 404 returned error can't find the container with id 3701ec00f63a2d931113c09a03403f14c88afacb49cbef43a39e269ee562a90d Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.641525 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfzm8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.641568 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfzm8" podUID="149cbe83-6c8e-471a-8534-73fa827b39a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.645257 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" event={"ID":"5419d7c7-80b9-446a-bb53-cfdc7f90a964","Type":"ContainerStarted","Data":"91c7f6907c7b516d6fa5ed1270f8add30945d24b18d2fcc1bdd8406cf9e4f75e"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.663353 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.663698 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.163562248 +0000 UTC m=+146.106970380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.664168 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.665775 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.165756045 +0000 UTC m=+146.109164167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.679977 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-s2xnb" event={"ID":"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6","Type":"ContainerStarted","Data":"464f254c4ff72fb82945ef698524e015ddff87d3f7059a78c6c11685c884f601"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.680026 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-s2xnb" event={"ID":"3e7e2fd3-44ef-40a4-9a17-38bcab70c7a6","Type":"ContainerStarted","Data":"58dd446653a45a151a9787825b717e513d69f9b86a5f12fd1ecb49ed4c9e16a1"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.682798 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-h2gbs" podStartSLOduration=124.682757427 podStartE2EDuration="2m4.682757427s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.643537689 +0000 UTC m=+145.586945821" watchObservedRunningTime="2026-02-15 17:08:09.682757427 +0000 UTC m=+145.626165559" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.685048 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-cfzm8" podStartSLOduration=125.685041395 
podStartE2EDuration="2m5.685041395s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.683586628 +0000 UTC m=+145.626994770" watchObservedRunningTime="2026-02-15 17:08:09.685041395 +0000 UTC m=+145.628449527" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.706226 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" event={"ID":"3a7b248c-9d3e-4ed9-802e-e381f76846e4","Type":"ContainerStarted","Data":"3039a5b801d86b5dfad232831be8a94a39d0ae1e0ed61d311bd5ba42fa1826f5"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.715244 4585 generic.go:334] "Generic (PLEG): container finished" podID="f55299e6-4571-42b7-b96f-35d8612609d2" containerID="953f069120a22d454a77ae18d473daa4b1758d92a8219bb1c1e2a51e5b4e0048" exitCode=0 Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.716022 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" event={"ID":"f55299e6-4571-42b7-b96f-35d8612609d2","Type":"ContainerDied","Data":"953f069120a22d454a77ae18d473daa4b1758d92a8219bb1c1e2a51e5b4e0048"} Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.746289 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-gfx64" podStartSLOduration=125.746259842 podStartE2EDuration="2m5.746259842s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.745283538 +0000 UTC m=+145.688691680" watchObservedRunningTime="2026-02-15 17:08:09.746259842 +0000 UTC m=+145.689667974" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.756194 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:09 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:09 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:09 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.756268 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.775247 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.780144 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.280100023 +0000 UTC m=+146.223508155 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.877925 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.878582 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.37856582 +0000 UTC m=+146.321973942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.897987 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fjk9c" podStartSLOduration=125.897967304 podStartE2EDuration="2m5.897967304s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.897891612 +0000 UTC m=+145.841299744" watchObservedRunningTime="2026-02-15 17:08:09.897967304 +0000 UTC m=+145.841375436" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.952808 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-s2xnb" podStartSLOduration=5.952792258 podStartE2EDuration="5.952792258s" podCreationTimestamp="2026-02-15 17:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:09.952173673 +0000 UTC m=+145.895581805" watchObservedRunningTime="2026-02-15 17:08:09.952792258 +0000 UTC m=+145.896200390" Feb 15 17:08:09 crc kubenswrapper[4585]: I0215 17:08:09.979164 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:09 crc kubenswrapper[4585]: E0215 17:08:09.979555 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.479534549 +0000 UTC m=+146.422942681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.081559 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.082257 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.582224542 +0000 UTC m=+146.525632674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.184395 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.184618 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.684571127 +0000 UTC m=+146.627979259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.184831 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.185169 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.685159572 +0000 UTC m=+146.628567704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.287643 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.287881 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.787839495 +0000 UTC m=+146.731247627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.288589 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.289145 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.789130028 +0000 UTC m=+146.732538160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.393910 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.394650 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.894635223 +0000 UTC m=+146.838043355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.496426 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.496799 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:10.996787372 +0000 UTC m=+146.940195504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.598489 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.599066 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.099023294 +0000 UTC m=+147.042431426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.700670 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.700990 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.200972408 +0000 UTC m=+147.144380540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.760697 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:10 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:10 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:10 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.760771 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.788012 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kzxfx" event={"ID":"73923b56-3ae8-426f-acac-cf6aedb40970","Type":"ContainerStarted","Data":"e073c42d05580404e98652f82cb4d2b16b1478eae32e458ec5a50d7c7a04c93f"} Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.802409 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.802858 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.302818911 +0000 UTC m=+147.246227043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.819463 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kzxfx" podStartSLOduration=6.819418582 podStartE2EDuration="6.819418582s" podCreationTimestamp="2026-02-15 17:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:10.818392056 +0000 UTC m=+146.761800188" watchObservedRunningTime="2026-02-15 17:08:10.819418582 +0000 UTC m=+146.762826714" Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.907357 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:10 crc kubenswrapper[4585]: E0215 17:08:10.908869 4585 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.408847588 +0000 UTC m=+147.352255720 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.969941 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" event={"ID":"1aad31ad-a5aa-45d1-8eae-75a95aa1407a","Type":"ContainerStarted","Data":"8f8a098e36be38c9c86f4b708b3937561e6d88a2e95a006f7813299e8633ee9c"} Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.972274 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" event={"ID":"a1985842-4918-4ebf-ac2a-1a08465d06df","Type":"ContainerStarted","Data":"fc2de2c715713fe3623716c90db695fa56dd9e715eba2b63b2ddb4458317add7"} Feb 15 17:08:10 crc kubenswrapper[4585]: I0215 17:08:10.991268 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vqkgn" podStartSLOduration=125.991248245 podStartE2EDuration="2m5.991248245s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:10.990391574 +0000 UTC m=+146.933799706" watchObservedRunningTime="2026-02-15 
17:08:10.991248245 +0000 UTC m=+146.934656377" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.008702 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" event={"ID":"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1","Type":"ContainerStarted","Data":"854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.008834 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.009656 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.011413 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.511389808 +0000 UTC m=+147.454797930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.027269 4585 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z5d9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.027756 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" podUID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.036051 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" event={"ID":"de36c82b-03fe-4750-95a9-75c7ee2e68bd","Type":"ContainerStarted","Data":"486dfbddd665be7a040c24ae022af490a8d4bbfb96c648b6d0d36f0f9cee902e"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.070273 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" podStartSLOduration=126.070231315 podStartE2EDuration="2m6.070231315s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 
17:08:11.05941646 +0000 UTC m=+147.002824612" watchObservedRunningTime="2026-02-15 17:08:11.070231315 +0000 UTC m=+147.013639447" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.072472 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" event={"ID":"593dc5c4-4044-44c3-bc9c-93b61527da19","Type":"ContainerStarted","Data":"3701ec00f63a2d931113c09a03403f14c88afacb49cbef43a39e269ee562a90d"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.082816 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" event={"ID":"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5","Type":"ContainerStarted","Data":"6ba849b2355093a854b62971d49f0f630fbae4a2b0368785eb3fbcda9dadc66d"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.084642 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" event={"ID":"815bdab7-6a3e-4b19-8f83-e6cc5c83461a","Type":"ContainerStarted","Data":"82e1ecad8dce5448c415a46c83acf7d9420be7ff90cfe4b9167640295a01a3d1"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.087395 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.111962 4585 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mgxft container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.112036 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" podUID="815bdab7-6a3e-4b19-8f83-e6cc5c83461a" 
containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.112533 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.113922 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.613905847 +0000 UTC m=+147.557313979 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.115753 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" podStartSLOduration=126.115739853 podStartE2EDuration="2m6.115739853s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.113195338 +0000 UTC m=+147.056603470" watchObservedRunningTime="2026-02-15 17:08:11.115739853 
+0000 UTC m=+147.059147985" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.130473 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" event={"ID":"7970ebfe-806a-44c4-9756-6d5bc4903eac","Type":"ContainerStarted","Data":"fd140035f8b87217afe8820be6baa69079456d1f5191e39c12531be98adef19a"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.131867 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.140227 4585 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jp6zp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.140307 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" podUID="7970ebfe-806a-44c4-9756-6d5bc4903eac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.149248 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-924b8" event={"ID":"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043","Type":"ContainerStarted","Data":"e1111194a2b5b1bfc8d58107ebfce85f8b30eacb59b5b8cb3b37c2e0975b5814"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.163646 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" 
event={"ID":"77fb475f-5fd7-4d9e-a81c-9afe54902813","Type":"ContainerStarted","Data":"3beb6fc399044a50995610405b54d732f5e29bdc82e5b043575c99f415f61e73"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.169205 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" podStartSLOduration=126.169181973 podStartE2EDuration="2m6.169181973s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.166939477 +0000 UTC m=+147.110347609" watchObservedRunningTime="2026-02-15 17:08:11.169181973 +0000 UTC m=+147.112590105" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.189113 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" event={"ID":"7c5d13a2-181b-4067-a4f6-dc796ee3e6ca","Type":"ContainerStarted","Data":"f9b73ff97cc3a0e13d61fe836f9a0831fb86f1c6ec8e9a723c1e08e2ed805d3e"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.189915 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.206035 4585 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-w5kxz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.206109 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" podUID="7c5d13a2-181b-4067-a4f6-dc796ee3e6ca" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.213979 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.216202 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.716181039 +0000 UTC m=+147.659589171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.231149 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ndfkp" podStartSLOduration=126.231124259 podStartE2EDuration="2m6.231124259s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.214861346 +0000 UTC m=+147.158269478" watchObservedRunningTime="2026-02-15 17:08:11.231124259 +0000 UTC m=+147.174532391" Feb 15 17:08:11 crc 
kubenswrapper[4585]: I0215 17:08:11.262191 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" event={"ID":"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f","Type":"ContainerStarted","Data":"9eab380030107e47f3e5c58191ae8170b43ac1256ef093c508558289dabdc975"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.279861 4585 generic.go:334] "Generic (PLEG): container finished" podID="a16374fa-419c-416f-86ab-0f18c37da52c" containerID="30dfaeb381dc53740aa1d3bfbfb6616c30e1907022da858e24961cc9318eefa1" exitCode=0 Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.279965 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" event={"ID":"a16374fa-419c-416f-86ab-0f18c37da52c","Type":"ContainerDied","Data":"30dfaeb381dc53740aa1d3bfbfb6616c30e1907022da858e24961cc9318eefa1"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.289029 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" podStartSLOduration=126.28890551 podStartE2EDuration="2m6.28890551s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.277746216 +0000 UTC m=+147.221154348" watchObservedRunningTime="2026-02-15 17:08:11.28890551 +0000 UTC m=+147.232313642" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.307165 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" podStartSLOduration=126.307151885 podStartE2EDuration="2m6.307151885s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-15 17:08:11.305963724 +0000 UTC m=+147.249371866" watchObservedRunningTime="2026-02-15 17:08:11.307151885 +0000 UTC m=+147.250560017" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.320093 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.320913 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.820901895 +0000 UTC m=+147.764310027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.327927 4585 generic.go:334] "Generic (PLEG): container finished" podID="9f407773-408b-4d66-a516-59646429f2fb" containerID="564e4536e2e745070440f05bf2ff30546b7569aced5f37dad6c7df5544cc353b" exitCode=0 Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.328315 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" 
event={"ID":"9f407773-408b-4d66-a516-59646429f2fb","Type":"ContainerDied","Data":"564e4536e2e745070440f05bf2ff30546b7569aced5f37dad6c7df5544cc353b"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.377663 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" event={"ID":"d3c2015e-dd19-4448-ae65-bce4f36d35c5","Type":"ContainerStarted","Data":"4bd341064f868c4dfc5ba56bb58ea51db716a22bca14677caaf19796fcfa5177"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.404143 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" event={"ID":"0c1d0020-9917-42f4-b0af-587a46e23d3e","Type":"ContainerStarted","Data":"3a19191ff408457db29755f4ad2ec673a1ae7aa17bdce189b5c0fc70f74fbeb3"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.418159 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" event={"ID":"70f4a804-22d1-4cdf-a403-d8e81dc3233e","Type":"ContainerStarted","Data":"cc0e24a6671c125919fa2d0c5a1ef95450be72f288e9f745020372f7b633cce2"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.424153 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.425030 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:11.925015064 +0000 UTC m=+147.868423196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.459062 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" event={"ID":"d9c426cb-d8ae-4150-adb2-327d42b7df5b","Type":"ContainerStarted","Data":"b72af6a523a88d6dc2a537455e69d7a896b5491f2b1da4b72bc772dc97ccf591"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.460258 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.472019 4585 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6jtm8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.472071 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" podUID="d9c426cb-d8ae-4150-adb2-327d42b7df5b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.478074 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" event={"ID":"6a1b22c6-f790-4713-963f-2a4f2141ac57","Type":"ContainerStarted","Data":"19bd029a20b4495a35a19dc28f04ea6d87ae0cd00be26f3b765430bcc0f5e57c"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.511538 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" event={"ID":"b6b450dc-9948-4b88-b099-3d1aebf653d3","Type":"ContainerStarted","Data":"7d44e65eb96a39c541d4c38939ecc3f089cbbd92469a65573e57d7d52f200d86"} Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.517475 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfzm8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.517511 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfzm8" podUID="149cbe83-6c8e-471a-8534-73fa827b39a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.523462 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xkhhs" podStartSLOduration=126.523447609 podStartE2EDuration="2m6.523447609s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.446221923 +0000 UTC m=+147.389630055" watchObservedRunningTime="2026-02-15 17:08:11.523447609 +0000 UTC m=+147.466855741" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.524149 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" podStartSLOduration=126.524145066 podStartE2EDuration="2m6.524145066s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.51602992 +0000 UTC m=+147.459438052" watchObservedRunningTime="2026-02-15 17:08:11.524145066 +0000 UTC m=+147.467553198" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.525697 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.527473 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.02746062 +0000 UTC m=+147.970868752 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.547282 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" podStartSLOduration=126.547265494 podStartE2EDuration="2m6.547265494s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.546057624 +0000 UTC m=+147.489465756" watchObservedRunningTime="2026-02-15 17:08:11.547265494 +0000 UTC m=+147.490673626" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.573111 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jf2jp" podStartSLOduration=127.573090282 podStartE2EDuration="2m7.573090282s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:11.570384933 +0000 UTC m=+147.513793075" watchObservedRunningTime="2026-02-15 17:08:11.573090282 +0000 UTC m=+147.516498414" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.629082 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.629259 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.129230481 +0000 UTC m=+148.072638603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.629308 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.630835 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.130820711 +0000 UTC m=+148.074228843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.702498 4585 csr.go:261] certificate signing request csr-4stmn is approved, waiting to be issued Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.710149 4585 csr.go:257] certificate signing request csr-4stmn is issued Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.731255 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.732086 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.232071787 +0000 UTC m=+148.175479919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.751090 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:11 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:11 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:11 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.751142 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.838891 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.839243 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:12.339229005 +0000 UTC m=+148.282637137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:11 crc kubenswrapper[4585]: I0215 17:08:11.939945 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:11 crc kubenswrapper[4585]: E0215 17:08:11.940276 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.440260236 +0000 UTC m=+148.383668368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.041541 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.041862 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.541848081 +0000 UTC m=+148.485256203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.143112 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.143291 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.643267602 +0000 UTC m=+148.586675734 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.143451 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.143727 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.643715873 +0000 UTC m=+148.587124005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.244344 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.244688 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.744647782 +0000 UTC m=+148.688055904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.346291 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.346669 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.846651968 +0000 UTC m=+148.790060100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.447391 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.447588 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.947561276 +0000 UTC m=+148.890969408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.447694 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.448012 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:12.948000357 +0000 UTC m=+148.891408489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.536221 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" event={"ID":"a16374fa-419c-416f-86ab-0f18c37da52c","Type":"ContainerStarted","Data":"fa2e851d50794e897b57750311bd4ad19e88f15dc2b685da0ee798943ac759bf"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.537264 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.538781 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" event={"ID":"f55299e6-4571-42b7-b96f-35d8612609d2","Type":"ContainerStarted","Data":"6941ba7ca1b65d4c6926d07efa84cfb3cf3b810c422ed03bf3b684768b1f75e9"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.540302 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jt5m5" event={"ID":"9a11420c-890e-4314-ba16-1867be7c401c","Type":"ContainerStarted","Data":"f3674967d8688aaa26e1232339596878485a452f4a971d680444465e719bcf71"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.540328 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.540338 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jt5m5" 
event={"ID":"9a11420c-890e-4314-ba16-1867be7c401c","Type":"ContainerStarted","Data":"9cce1d7c0f8c958d6f0dc6b668386540cee98c1f6a8c5070617627976068c50a"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.542153 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" event={"ID":"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882","Type":"ContainerStarted","Data":"91afadd1a1fc1b399fe65b3df55b5276fa0794140331b1f5f1f07bd77457a882"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.542184 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" event={"ID":"dd9c9717-dbe1-42fd-9648-e4a8cc4ec882","Type":"ContainerStarted","Data":"351a5ec2adc5fce19aa45b711f7612763e3fb6f9f63aed598249bba78db45556"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.542333 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.543809 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" event={"ID":"a1985842-4918-4ebf-ac2a-1a08465d06df","Type":"ContainerStarted","Data":"8e9d99ec1ca3600660f000b3916c9cff66291377b274bc57083d6943548dda67"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.548983 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.549137 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.04911662 +0000 UTC m=+148.992524752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.549301 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.549369 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.549664 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.049652083 +0000 UTC m=+148.993060215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.550039 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" event={"ID":"b64a1a1d-965c-4be7-ad5b-ce4fd8af59c5","Type":"ContainerStarted","Data":"4bafbdb2b5d382003756061f80f3d29605ff01e8b432b7d76f444c3a29e0bb58"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.550448 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.551999 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" event={"ID":"9f407773-408b-4d66-a516-59646429f2fb","Type":"ContainerStarted","Data":"d473bed487214d959fe0eef4259633a78e55d46d0ded09e62f3d100d99a07dd5"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.552032 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" event={"ID":"9f407773-408b-4d66-a516-59646429f2fb","Type":"ContainerStarted","Data":"3f1449e9f4df88cd239f1c237cf17cb310c76a248426abd24f621c01af03473e"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.555301 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" event={"ID":"24e3abe1-a96b-4a1c-8fd7-fa28da78822f","Type":"ContainerStarted","Data":"aeeff2cca86c7407ccf742838b7f4b1b6d5dcc95711ae15262fa3cea8428aa09"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.555349 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" event={"ID":"24e3abe1-a96b-4a1c-8fd7-fa28da78822f","Type":"ContainerStarted","Data":"b65a30b781f94cbf0d17b938ac488202d61c486889260bf47c25bb579538a64f"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.557850 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-924b8" event={"ID":"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043","Type":"ContainerStarted","Data":"91b4bc5070c248dec53a40c58d620a534960a187b2ab50fc4dfb9d2eaa8dcbb2"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.557888 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-924b8" event={"ID":"64ee5f2d-78e3-4c7b-a1ea-1b38d888d043","Type":"ContainerStarted","Data":"7a8bd9a701afd117b45e8c37821a1d85562d7b09e82d9917c20d60ad371a6824"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.559235 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kcvnt" event={"ID":"c5a28822-618f-48f8-bc6d-9f4aa2be4a9f","Type":"ContainerStarted","Data":"bb042e0840d71dc99a64e853d932994b626d276e45162640bb919ac2488433fe"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.564141 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" podStartSLOduration=128.564114581 podStartE2EDuration="2m8.564114581s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.55931238 +0000 UTC m=+148.502720512" watchObservedRunningTime="2026-02-15 17:08:12.564114581 +0000 UTC m=+148.507522743" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.564396 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" event={"ID":"70f4a804-22d1-4cdf-a403-d8e81dc3233e","Type":"ContainerStarted","Data":"5a7ddafb6e9e3d4dedb259d4660f2e54280ad5460926c7e1c6c675b5e64022ee"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.566697 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" event={"ID":"d3c2015e-dd19-4448-ae65-bce4f36d35c5","Type":"ContainerStarted","Data":"f5394f4c029c845ae8e4174989e00b20f4873c2dface42e2d8c52166e79f8231"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.566738 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" event={"ID":"d3c2015e-dd19-4448-ae65-bce4f36d35c5","Type":"ContainerStarted","Data":"d1570703cb1be8c8fd23c7d5214f5e05d0d4ca060f2650f14724672d42af21b6"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.572243 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" event={"ID":"389330df-47c0-4815-9070-2664655acaab","Type":"ContainerStarted","Data":"ec734deb30df146b40d6663ac604a5e81bdc3969c765a326df9f7fdb7abff57c"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.573627 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" event={"ID":"593dc5c4-4044-44c3-bc9c-93b61527da19","Type":"ContainerStarted","Data":"e91ebd2e38b6216e4916095c7932ee095aed2badaf50fd14ecb34fd33b0cd7fa"} Feb 15 17:08:12 crc 
kubenswrapper[4585]: I0215 17:08:12.576046 4585 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z5d9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.576098 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" podUID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.576084 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" event={"ID":"3a7b248c-9d3e-4ed9-802e-e381f76846e4","Type":"ContainerStarted","Data":"7e86cc27276a585089157474617ac059b255cdc96582c2f1c743abb4b49cc884"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.576161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" event={"ID":"3a7b248c-9d3e-4ed9-802e-e381f76846e4","Type":"ContainerStarted","Data":"c2afca1dea13b074f507f278b1bd23fbf00f34c0a8ddc45197a417bf53bf9f21"} Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.577418 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfzm8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.577451 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfzm8" podUID="149cbe83-6c8e-471a-8534-73fa827b39a6" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.596962 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.631199 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-924b8" podStartSLOduration=127.631177228 podStartE2EDuration="2m7.631177228s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.624694543 +0000 UTC m=+148.568102675" watchObservedRunningTime="2026-02-15 17:08:12.631177228 +0000 UTC m=+148.574585360" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.633043 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jt5m5" podStartSLOduration=9.633034875 podStartE2EDuration="9.633034875s" podCreationTimestamp="2026-02-15 17:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.591362185 +0000 UTC m=+148.534770337" watchObservedRunningTime="2026-02-15 17:08:12.633034875 +0000 UTC m=+148.576442997" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.637143 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mgxft" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.650810 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.651179 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.651334 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.651476 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.654341 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.154323157 +0000 UTC m=+149.097731289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.669456 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.669456 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.683209 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5td8z" podStartSLOduration=127.683191212 podStartE2EDuration="2m7.683191212s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.68155544 +0000 UTC m=+148.624963572" watchObservedRunningTime="2026-02-15 17:08:12.683191212 +0000 UTC m=+148.626599344" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.683411 4585 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-w5kxz" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.686216 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.712270 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-15 17:03:11 +0000 UTC, rotation deadline is 2026-11-06 21:26:01.226481249 +0000 UTC Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.712310 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6340h17m48.514173856s for next certificate rotation Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.750933 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w62vm" podStartSLOduration=127.750918205 podStartE2EDuration="2m7.750918205s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.749249052 +0000 UTC m=+148.692657184" watchObservedRunningTime="2026-02-15 17:08:12.750918205 +0000 UTC m=+148.694326337" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.754372 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.754741 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.254729182 +0000 UTC m=+149.198137314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.756789 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:12 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:12 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:12 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.756840 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.824550 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" podStartSLOduration=127.824535658 podStartE2EDuration="2m7.824535658s" 
podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.816892925 +0000 UTC m=+148.760301057" watchObservedRunningTime="2026-02-15 17:08:12.824535658 +0000 UTC m=+148.767943790" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.855175 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.855185 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.855359 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.355333962 +0000 UTC m=+149.298742094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.855471 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.855826 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.355815044 +0000 UTC m=+149.299223176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.862833 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.868865 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.870814 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" podStartSLOduration=128.870805046 podStartE2EDuration="2m8.870805046s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.870419787 +0000 UTC m=+148.813827919" watchObservedRunningTime="2026-02-15 17:08:12.870805046 +0000 UTC m=+148.814213178" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.921508 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" podStartSLOduration=127.921490176 podStartE2EDuration="2m7.921490176s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.902728458 +0000 UTC m=+148.846136590" watchObservedRunningTime="2026-02-15 17:08:12.921490176 +0000 UTC m=+148.864898308" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.923728 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5m8q2" podStartSLOduration=127.923714693 podStartE2EDuration="2m7.923714693s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.92010496 +0000 UTC m=+148.863513092" 
watchObservedRunningTime="2026-02-15 17:08:12.923714693 +0000 UTC m=+148.867122825" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.951051 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zfq7z" podStartSLOduration=127.951039018 podStartE2EDuration="2m7.951039018s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:12.948825582 +0000 UTC m=+148.892233714" watchObservedRunningTime="2026-02-15 17:08:12.951039018 +0000 UTC m=+148.894447150" Feb 15 17:08:12 crc kubenswrapper[4585]: I0215 17:08:12.957144 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:12 crc kubenswrapper[4585]: E0215 17:08:12.957452 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.457439131 +0000 UTC m=+149.400847263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.059168 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.059535 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.559521468 +0000 UTC m=+149.502929600 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.130463 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hq2n4" podStartSLOduration=128.130447693 podStartE2EDuration="2m8.130447693s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:13.093693918 +0000 UTC m=+149.037102050" watchObservedRunningTime="2026-02-15 17:08:13.130447693 +0000 UTC m=+149.073855825" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.162451 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.162773 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.662759476 +0000 UTC m=+149.606167608 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.166102 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vg82t" podStartSLOduration=128.16609076 podStartE2EDuration="2m8.16609076s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:13.132307531 +0000 UTC m=+149.075715663" watchObservedRunningTime="2026-02-15 17:08:13.16609076 +0000 UTC m=+149.109498892" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.262310 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wtss4" podStartSLOduration=128.262285839 podStartE2EDuration="2m8.262285839s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:13.222373412 +0000 UTC m=+149.165781544" watchObservedRunningTime="2026-02-15 17:08:13.262285839 +0000 UTC m=+149.205693971" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.263407 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k9g4d" podStartSLOduration=128.263402247 podStartE2EDuration="2m8.263402247s" podCreationTimestamp="2026-02-15 
17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:13.260969955 +0000 UTC m=+149.204378087" watchObservedRunningTime="2026-02-15 17:08:13.263402247 +0000 UTC m=+149.206810379" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.263755 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.263998 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.763988302 +0000 UTC m=+149.707396434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.364270 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.364401 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.864381997 +0000 UTC m=+149.807790119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.364546 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.364819 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.864811427 +0000 UTC m=+149.808219559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.468416 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.469037 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:13.969021929 +0000 UTC m=+149.912430061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.504826 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sbsth"] Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.505756 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.513774 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.569814 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-utilities\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.569873 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-catalog-content\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.569935 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.569959 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hqn8\" (UniqueName: \"kubernetes.io/projected/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-kube-api-access-7hqn8\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc 
kubenswrapper[4585]: E0215 17:08:13.570326 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.070314597 +0000 UTC m=+150.013722729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.578990 4585 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jp6zp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded" start-of-body= Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.579046 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" podUID="7970ebfe-806a-44c4-9756-6d5bc4903eac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.583360 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" event={"ID":"de36c82b-03fe-4750-95a9-75c7ee2e68bd","Type":"ContainerStarted","Data":"640469d3c4b141e01ee548005bd49aa19de93b0076555b2c7c04ca40ea3eb09d"} Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.637976 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sbsth"] Feb 15 
17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.673057 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.673167 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.173153024 +0000 UTC m=+150.116561156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.673395 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-catalog-content\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.673530 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: 
\"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.673655 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hqn8\" (UniqueName: \"kubernetes.io/projected/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-kube-api-access-7hqn8\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.673997 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-utilities\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.674359 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-utilities\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.677477 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.177460384 +0000 UTC m=+150.120868516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.678779 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-catalog-content\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.688734 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4c48n"] Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.689658 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.699639 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.710625 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4c48n"] Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.760470 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hqn8\" (UniqueName: \"kubernetes.io/projected/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-kube-api-access-7hqn8\") pod \"certified-operators-sbsth\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.773297 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:13 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:13 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:13 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.773343 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.775977 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.776191 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvz4\" (UniqueName: \"kubernetes.io/projected/329f56ce-8e35-4eec-adaf-123808e4af4e-kube-api-access-vbvz4\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.776262 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-catalog-content\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.776282 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-utilities\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.776401 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.276386201 +0000 UTC m=+150.219794333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.832863 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.877093 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-catalog-content\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.877127 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-utilities\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.877168 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvz4\" (UniqueName: \"kubernetes.io/projected/329f56ce-8e35-4eec-adaf-123808e4af4e-kube-api-access-vbvz4\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.877187 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.877451 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.377440693 +0000 UTC m=+150.320848815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.877938 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-catalog-content\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.878138 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-utilities\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.890545 4585 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-wlvv9"] Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.891407 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.915024 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlvv9"] Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.920304 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbvz4\" (UniqueName: \"kubernetes.io/projected/329f56ce-8e35-4eec-adaf-123808e4af4e-kube-api-access-vbvz4\") pod \"community-operators-4c48n\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:13 crc kubenswrapper[4585]: I0215 17:08:13.981861 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:13 crc kubenswrapper[4585]: E0215 17:08:13.982538 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.482522257 +0000 UTC m=+150.425930389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.025840 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.083961 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-catalog-content\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.084237 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.084258 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvnm2\" (UniqueName: \"kubernetes.io/projected/62ebdb47-d46a-431d-a8ce-b993c2e561dd-kube-api-access-dvnm2\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 
17:08:14.084275 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-utilities\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.084547 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.584535943 +0000 UTC m=+150.527944075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.096641 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z6jz4"] Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.105554 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.128042 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6jz4"] Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.185570 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.185802 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-catalog-content\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.185875 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvnm2\" (UniqueName: \"kubernetes.io/projected/62ebdb47-d46a-431d-a8ce-b993c2e561dd-kube-api-access-dvnm2\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.185898 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-utilities\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.185925 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-catalog-content\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.185950 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqdv\" (UniqueName: \"kubernetes.io/projected/4e461961-6ad6-4275-b1e4-dc540e024b8a-kube-api-access-4nqdv\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.185978 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-utilities\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.186120 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.686102267 +0000 UTC m=+150.629510399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.186515 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-catalog-content\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.187129 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-utilities\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.238424 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvnm2\" (UniqueName: \"kubernetes.io/projected/62ebdb47-d46a-431d-a8ce-b993c2e561dd-kube-api-access-dvnm2\") pod \"certified-operators-wlvv9\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.273057 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.287179 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.287223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-catalog-content\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.287242 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqdv\" (UniqueName: \"kubernetes.io/projected/4e461961-6ad6-4275-b1e4-dc540e024b8a-kube-api-access-4nqdv\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.287265 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-utilities\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.287691 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-utilities\") pod \"community-operators-z6jz4\" 
(UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.287919 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.787908529 +0000 UTC m=+150.731316661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.288265 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-catalog-content\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.329679 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqdv\" (UniqueName: \"kubernetes.io/projected/4e461961-6ad6-4275-b1e4-dc540e024b8a-kube-api-access-4nqdv\") pod \"community-operators-z6jz4\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.388664 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.389087 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.889069623 +0000 UTC m=+150.832477755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.440820 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.448129 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.492376 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.492697 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:14.99268553 +0000 UTC m=+150.936093652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.587847 4585 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jp6zp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.587909 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" podUID="7970ebfe-806a-44c4-9756-6d5bc4903eac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.598102 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.598766 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:15.098751309 +0000 UTC m=+151.042159441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.640311 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2025ac7096f56e9a61862029ee4768bd2ec40b20b5d29d575cab158d9f3e3185"} Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.710323 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.710932 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:15.210921123 +0000 UTC m=+151.154329255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.754672 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:14 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:14 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:14 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.754723 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.813052 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.814790 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:15.314774906 +0000 UTC m=+151.258183038 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:14 crc kubenswrapper[4585]: I0215 17:08:14.922255 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:14 crc kubenswrapper[4585]: E0215 17:08:14.922579 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:15.422568759 +0000 UTC m=+151.365976881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.022820 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.023097 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:15.523084667 +0000 UTC m=+151.466492789 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.036061 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sbsth"] Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.133217 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.133927 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:15.633916338 +0000 UTC m=+151.577324470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.234413 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.234733 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:15.734718813 +0000 UTC m=+151.678126945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.335901 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.336736 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:15.836723728 +0000 UTC m=+151.780131860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.441477 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.441927 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:15.941911426 +0000 UTC m=+151.885319558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.462577 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4c48n"] Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.542682 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.542950 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.042939997 +0000 UTC m=+151.986348129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.641834 4585 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-qdjvg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.642108 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" podUID="a16374fa-419c-416f-86ab-0f18c37da52c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.643461 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.643806 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.143793613 +0000 UTC m=+152.087201745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.696450 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlvv9"] Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.722475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4ec9d44da17ce9fa80de33077790782986f82afa24aa7d3949f71e2a81807cf4"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.735640 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6vk82"] Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.736730 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.747834 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.749145 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.749503 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.249488912 +0000 UTC m=+152.192897044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.751305 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:15 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:15 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:15 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.751561 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.757385 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" event={"ID":"de36c82b-03fe-4750-95a9-75c7ee2e68bd","Type":"ContainerStarted","Data":"c1a87ca861e33cf21fbadd3e2ae86b8c8538f0a58d62d267d90f6512c902532b"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.766403 4585 generic.go:334] "Generic (PLEG): container finished" podID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerID="a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f" exitCode=0 Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.769785 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-sbsth" event={"ID":"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4","Type":"ContainerDied","Data":"a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.769827 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbsth" event={"ID":"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4","Type":"ContainerStarted","Data":"361c4f68b6ebbd666e53f0e075a875c539c873dd7d8540aa501673eebfb74c25"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.784200 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"59506da03ad8948f75773aee3684905f477e87980df0b64a4cee39a2e7c2ce68"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.785170 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.785694 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.799920 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c48n" event={"ID":"329f56ce-8e35-4eec-adaf-123808e4af4e","Type":"ContainerStarted","Data":"e96e78abf43ca0dc906cdb3f3be4dc4a1aa3ea23fcc46744b884a2315d0a26df"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.820798 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b65c6c5ebbd96dd68da6c2e6cb82a254bede456b6a3e106438418488084b60d9"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.820841 4585 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"86355fea99414540e9749bc2daf07ad67fff8f9e86c789e0c9083c869242361c"} Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.830326 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6jz4"] Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.851296 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.851481 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-utilities\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.851551 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.351522479 +0000 UTC m=+152.294930611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.851678 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-catalog-content\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.851749 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.851827 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lr2c\" (UniqueName: \"kubernetes.io/projected/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-kube-api-access-5lr2c\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.853164 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:16.35315228 +0000 UTC m=+152.296560412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.879690 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vk82"] Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.956202 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.956328 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.456306175 +0000 UTC m=+152.399714307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.956384 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-catalog-content\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.956417 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.956454 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lr2c\" (UniqueName: \"kubernetes.io/projected/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-kube-api-access-5lr2c\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.956542 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-utilities\") pod \"redhat-marketplace-6vk82\" (UID: 
\"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.956788 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-catalog-content\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: I0215 17:08:15.956878 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-utilities\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:15 crc kubenswrapper[4585]: E0215 17:08:15.957212 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.457205079 +0000 UTC m=+152.400613211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.007909 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lr2c\" (UniqueName: \"kubernetes.io/projected/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-kube-api-access-5lr2c\") pod \"redhat-marketplace-6vk82\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.057075 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.057418 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.557402558 +0000 UTC m=+152.500810690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.071284 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4x5fb"] Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.073343 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.107068 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.145824 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x5fb"] Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.159734 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6cdl\" (UniqueName: \"kubernetes.io/projected/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-kube-api-access-d6cdl\") pod \"redhat-marketplace-4x5fb\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.159827 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-catalog-content\") pod \"redhat-marketplace-4x5fb\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " 
pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.159866 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.159898 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-utilities\") pod \"redhat-marketplace-4x5fb\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.160222 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.660210265 +0000 UTC m=+152.603618397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.264334 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.264533 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.764505198 +0000 UTC m=+152.707913330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.264974 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-catalog-content\") pod \"redhat-marketplace-4x5fb\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.265012 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.265056 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-utilities\") pod \"redhat-marketplace-4x5fb\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.265104 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6cdl\" (UniqueName: \"kubernetes.io/projected/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-kube-api-access-d6cdl\") pod \"redhat-marketplace-4x5fb\" (UID: 
\"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.265884 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-catalog-content\") pod \"redhat-marketplace-4x5fb\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.266144 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.76613292 +0000 UTC m=+152.709541052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.266465 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-utilities\") pod \"redhat-marketplace-4x5fb\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.308095 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6cdl\" (UniqueName: \"kubernetes.io/projected/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-kube-api-access-d6cdl\") pod \"redhat-marketplace-4x5fb\" (UID: 
\"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.347696 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.347736 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.350712 4585 patch_prober.go:28] interesting pod/console-f9d7485db-gfx64 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.350763 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gfx64" podUID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.365920 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.366160 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.866134324 +0000 UTC m=+152.809542456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.366276 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.366609 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:16.866581447 +0000 UTC m=+152.809989579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.427029 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.440662 4585 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.444785 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.445110 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.452748 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.452779 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.458818 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.467914 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.469229 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-15 17:08:16.969207128 +0000 UTC m=+152.912615260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.487152 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfzm8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.487411 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfzm8" podUID="149cbe83-6c8e-471a-8534-73fa827b39a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.488701 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfzm8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.488795 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cfzm8" podUID="149cbe83-6c8e-471a-8534-73fa827b39a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 15 17:08:16 crc kubenswrapper[4585]: 
I0215 17:08:16.576930 4585 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-qdjvg container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.577248 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" podUID="a16374fa-419c-416f-86ab-0f18c37da52c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.578361 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.580264 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.080250094 +0000 UTC m=+153.023658226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.642785 4585 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-qdjvg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.642852 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" podUID="a16374fa-419c-416f-86ab-0f18c37da52c" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.657655 4585 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gjvln container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]log ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]etcd ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/generic-apiserver-start-informers ok Feb 15 17:08:16 crc kubenswrapper[4585]: 
[+]poststarthook/max-in-flight-filter ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 15 17:08:16 crc kubenswrapper[4585]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 15 17:08:16 crc kubenswrapper[4585]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/project.openshift.io-projectcache ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-startinformers ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 15 17:08:16 crc kubenswrapper[4585]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 15 17:08:16 crc kubenswrapper[4585]: livez check failed Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.657724 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" podUID="9f407773-408b-4d66-a516-59646429f2fb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.675019 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8nshz"] Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.676034 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.678681 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.684130 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.684389 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.184355643 +0000 UTC m=+153.127763775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.684544 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.684930 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.184923187 +0000 UTC m=+153.128331319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.739373 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8nshz"] Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.747656 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.753715 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:16 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:16 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:16 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.753777 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.785206 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.785345 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gwvw\" (UniqueName: \"kubernetes.io/projected/76a47487-6876-4c12-9b15-d8594ad9d748-kube-api-access-9gwvw\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.785365 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-utilities\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.785444 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-catalog-content\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.785539 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.285525637 +0000 UTC m=+153.228933769 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.821125 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x5fb"] Feb 15 17:08:16 crc kubenswrapper[4585]: W0215 17:08:16.867798 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0804c7_219d_4c0c_ae97_f7e8ebac6895.slice/crio-53f3a29a3b875d46e6594ffc5cedce52903e2a1c8c8028b49900eb6e934952ea WatchSource:0}: Error finding container 53f3a29a3b875d46e6594ffc5cedce52903e2a1c8c8028b49900eb6e934952ea: Status 404 returned error can't find the container with id 53f3a29a3b875d46e6594ffc5cedce52903e2a1c8c8028b49900eb6e934952ea Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.873171 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerID="6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d" exitCode=0 Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887366 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-catalog-content\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887404 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887426 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gwvw\" (UniqueName: \"kubernetes.io/projected/76a47487-6876-4c12-9b15-d8594ad9d748-kube-api-access-9gwvw\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887444 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-utilities\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887489 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6jz4" event={"ID":"4e461961-6ad6-4275-b1e4-dc540e024b8a","Type":"ContainerDied","Data":"6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d"} Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887527 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6jz4" event={"ID":"4e461961-6ad6-4275-b1e4-dc540e024b8a","Type":"ContainerStarted","Data":"8cba02e6c0a49ad15f139df883b59b6da57da02d59900e0891a9e7de2ae79599"} Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887561 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.887789 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-utilities\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.888451 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-catalog-content\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.888730 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.388718324 +0000 UTC m=+153.332126456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.888923 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jp6zp" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.899494 4585 generic.go:334] "Generic (PLEG): container finished" podID="b6b450dc-9948-4b88-b099-3d1aebf653d3" containerID="7d44e65eb96a39c541d4c38939ecc3f089cbbd92469a65573e57d7d52f200d86" exitCode=0 Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.899550 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" event={"ID":"b6b450dc-9948-4b88-b099-3d1aebf653d3","Type":"ContainerDied","Data":"7d44e65eb96a39c541d4c38939ecc3f089cbbd92469a65573e57d7d52f200d86"} Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.940765 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gwvw\" (UniqueName: \"kubernetes.io/projected/76a47487-6876-4c12-9b15-d8594ad9d748-kube-api-access-9gwvw\") pod \"redhat-operators-8nshz\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.949698 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" event={"ID":"de36c82b-03fe-4750-95a9-75c7ee2e68bd","Type":"ContainerStarted","Data":"6d80c7dac55838921397c3b0cf032447fec1a524d2c20811b72447542eaec91f"} Feb 15 17:08:16 
crc kubenswrapper[4585]: I0215 17:08:16.949752 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" event={"ID":"de36c82b-03fe-4750-95a9-75c7ee2e68bd","Type":"ContainerStarted","Data":"a10f69721c527ad01073c0cbf33dd1f4d6464c9516aacb557cd94dee00b49464"} Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.979330 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vk82"] Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.986071 4585 generic.go:334] "Generic (PLEG): container finished" podID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerID="f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272" exitCode=0 Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.986259 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlvv9" event={"ID":"62ebdb47-d46a-431d-a8ce-b993c2e561dd","Type":"ContainerDied","Data":"f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272"} Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.986335 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlvv9" event={"ID":"62ebdb47-d46a-431d-a8ce-b993c2e561dd","Type":"ContainerStarted","Data":"d44fcb7e52e8f11311184be534a6e2e11802052c34b22fd1f28d1d1e1c858f79"} Feb 15 17:08:16 crc kubenswrapper[4585]: I0215 17:08:16.988137 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:16 crc kubenswrapper[4585]: E0215 17:08:16.989146 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.489130479 +0000 UTC m=+153.432538611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.005530 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.009597 4585 generic.go:334] "Generic (PLEG): container finished" podID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerID="bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48" exitCode=0 Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.009693 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c48n" event={"ID":"329f56ce-8e35-4eec-adaf-123808e4af4e","Type":"ContainerDied","Data":"bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48"} Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.014133 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.014166 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.064706 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b43326a30129ad4bb46e1243c790b488074b7477a879d06225edd286d1eff4cb"} Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.080870 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fg58h" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.090332 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:17 crc kubenswrapper[4585]: E0215 17:08:17.091097 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.591082893 +0000 UTC m=+153.534491025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.092752 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jsq6s"] Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.097898 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.100755 4585 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-15T17:08:16.440683492Z","Handler":null,"Name":""} Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.126579 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jsq6s"] Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.128660 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-6vf5k" podStartSLOduration=14.128643239 podStartE2EDuration="14.128643239s" podCreationTimestamp="2026-02-15 17:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:17.121260481 +0000 UTC m=+153.064668613" watchObservedRunningTime="2026-02-15 17:08:17.128643239 +0000 UTC m=+153.072051371" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.145852 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.148294 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.185013 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.185411 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.198272 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:17 crc kubenswrapper[4585]: E0215 17:08:17.198914 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.698875467 +0000 UTC m=+153.642283599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.202345 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-utilities\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.202474 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.202641 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8pk\" (UniqueName: \"kubernetes.io/projected/b27ca682-1703-426b-b644-3f28226eda98-kube-api-access-pb8pk\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.202757 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-catalog-content\") 
pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: E0215 17:08:17.209520 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-15 17:08:17.709496517 +0000 UTC m=+153.652904649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-st5w4" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.210693 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.272374 4585 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.272424 4585 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.304472 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.304662 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-utilities\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.304710 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.304744 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8pk\" (UniqueName: \"kubernetes.io/projected/b27ca682-1703-426b-b644-3f28226eda98-kube-api-access-pb8pk\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.304786 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-catalog-content\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.304858 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.305352 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-utilities\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.305621 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-catalog-content\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.322218 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.340290 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8pk\" (UniqueName: \"kubernetes.io/projected/b27ca682-1703-426b-b644-3f28226eda98-kube-api-access-pb8pk\") pod \"redhat-operators-jsq6s\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.405982 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.406051 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.406078 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.406199 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.435256 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.436439 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.495324 4585 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.495360 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.511703 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.564792 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-st5w4\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.613747 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8nshz"] Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.755093 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:17 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:17 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:17 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.755134 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.773040 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:17 crc kubenswrapper[4585]: I0215 17:08:17.857855 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jsq6s"] Feb 15 17:08:17 crc kubenswrapper[4585]: W0215 17:08:17.918159 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb27ca682_1703_426b_b644_3f28226eda98.slice/crio-b263e182aa1c8e85c4cd5bbb5fbe190e2a94661057126ad9d10a4f61e0106815 WatchSource:0}: Error finding container b263e182aa1c8e85c4cd5bbb5fbe190e2a94661057126ad9d10a4f61e0106815: Status 404 returned error can't find the container with id b263e182aa1c8e85c4cd5bbb5fbe190e2a94661057126ad9d10a4f61e0106815 Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.122250 4585 generic.go:334] "Generic (PLEG): container finished" podID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerID="d7d3319c4e54810987b9dab93efa93d67dc5f53c5e9f4492a39b4f603aa52597" exitCode=0 Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.122337 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x5fb" event={"ID":"9d0804c7-219d-4c0c-ae97-f7e8ebac6895","Type":"ContainerDied","Data":"d7d3319c4e54810987b9dab93efa93d67dc5f53c5e9f4492a39b4f603aa52597"} Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.122380 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x5fb" event={"ID":"9d0804c7-219d-4c0c-ae97-f7e8ebac6895","Type":"ContainerStarted","Data":"53f3a29a3b875d46e6594ffc5cedce52903e2a1c8c8028b49900eb6e934952ea"} Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.126738 4585 generic.go:334] "Generic (PLEG): container finished" podID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerID="1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d" exitCode=0 Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 
17:08:18.127397 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vk82" event={"ID":"d3beabcf-c3ca-49e7-a5e5-5719f184fab7","Type":"ContainerDied","Data":"1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d"} Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.127421 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vk82" event={"ID":"d3beabcf-c3ca-49e7-a5e5-5719f184fab7","Type":"ContainerStarted","Data":"613aa137e89e3a7d4b111cd788b87748782a516a498dae72ab1c4bf691058559"} Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.129989 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsq6s" event={"ID":"b27ca682-1703-426b-b644-3f28226eda98","Type":"ContainerStarted","Data":"b263e182aa1c8e85c4cd5bbb5fbe190e2a94661057126ad9d10a4f61e0106815"} Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.132053 4585 generic.go:334] "Generic (PLEG): container finished" podID="76a47487-6876-4c12-9b15-d8594ad9d748" containerID="1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946" exitCode=0 Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.133008 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nshz" event={"ID":"76a47487-6876-4c12-9b15-d8594ad9d748","Type":"ContainerDied","Data":"1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946"} Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.133031 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nshz" event={"ID":"76a47487-6876-4c12-9b15-d8594ad9d748","Type":"ContainerStarted","Data":"094f855cd11a8fc900a090132f3168ee1356d52dd22272a3d0dcb87f557cc045"} Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.231687 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 15 
17:08:18 crc kubenswrapper[4585]: W0215 17:08:18.256255 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4df3c01d_9cdf_4d82_a17e_5cd1c9e11e82.slice/crio-d622e22e27bef4956f17209992d5571dacf9f7c4d94162569e234102001f769e WatchSource:0}: Error finding container d622e22e27bef4956f17209992d5571dacf9f7c4d94162569e234102001f769e: Status 404 returned error can't find the container with id d622e22e27bef4956f17209992d5571dacf9f7c4d94162569e234102001f769e Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.340052 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-st5w4"] Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.566065 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.581333 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qdjvg" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.662559 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b450dc-9948-4b88-b099-3d1aebf653d3-config-volume\") pod \"b6b450dc-9948-4b88-b099-3d1aebf653d3\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.662953 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpgcc\" (UniqueName: \"kubernetes.io/projected/b6b450dc-9948-4b88-b099-3d1aebf653d3-kube-api-access-qpgcc\") pod \"b6b450dc-9948-4b88-b099-3d1aebf653d3\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.663021 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/b6b450dc-9948-4b88-b099-3d1aebf653d3-secret-volume\") pod \"b6b450dc-9948-4b88-b099-3d1aebf653d3\" (UID: \"b6b450dc-9948-4b88-b099-3d1aebf653d3\") " Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.664342 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b450dc-9948-4b88-b099-3d1aebf653d3-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6b450dc-9948-4b88-b099-3d1aebf653d3" (UID: "b6b450dc-9948-4b88-b099-3d1aebf653d3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.678748 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b450dc-9948-4b88-b099-3d1aebf653d3-kube-api-access-qpgcc" (OuterVolumeSpecName: "kube-api-access-qpgcc") pod "b6b450dc-9948-4b88-b099-3d1aebf653d3" (UID: "b6b450dc-9948-4b88-b099-3d1aebf653d3"). InnerVolumeSpecName "kube-api-access-qpgcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.679511 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b450dc-9948-4b88-b099-3d1aebf653d3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6b450dc-9948-4b88-b099-3d1aebf653d3" (UID: "b6b450dc-9948-4b88-b099-3d1aebf653d3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.755446 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:18 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:18 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:18 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.755505 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.764953 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpgcc\" (UniqueName: \"kubernetes.io/projected/b6b450dc-9948-4b88-b099-3d1aebf653d3-kube-api-access-qpgcc\") on node \"crc\" DevicePath \"\"" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.764981 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6b450dc-9948-4b88-b099-3d1aebf653d3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.764989 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6b450dc-9948-4b88-b099-3d1aebf653d3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 15 17:08:18 crc kubenswrapper[4585]: I0215 17:08:18.914719 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 
17:08:19.154680 4585 generic.go:334] "Generic (PLEG): container finished" podID="b27ca682-1703-426b-b644-3f28226eda98" containerID="ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4" exitCode=0 Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.154748 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsq6s" event={"ID":"b27ca682-1703-426b-b644-3f28226eda98","Type":"ContainerDied","Data":"ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4"} Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.199187 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82","Type":"ContainerStarted","Data":"d622e22e27bef4956f17209992d5571dacf9f7c4d94162569e234102001f769e"} Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.218006 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" event={"ID":"23788c17-8897-4c56-b718-ebf061e5e15c","Type":"ContainerStarted","Data":"58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219"} Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.218058 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" event={"ID":"23788c17-8897-4c56-b718-ebf061e5e15c","Type":"ContainerStarted","Data":"d02fd00030cadd18c165fc74fd701cc2ab67d99f991e7cb88ea5d597f2a7b0d0"} Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.218108 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.225308 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.225349 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws" event={"ID":"b6b450dc-9948-4b88-b099-3d1aebf653d3","Type":"ContainerDied","Data":"8b941bd3cb337caa0fb12b4a29a07f78f10cfd0cbc40a7130b0812a4e94946d5"} Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.225369 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b941bd3cb337caa0fb12b4a29a07f78f10cfd0cbc40a7130b0812a4e94946d5" Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.751445 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:19 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:19 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:19 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:19 crc kubenswrapper[4585]: I0215 17:08:19.751861 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:20 crc kubenswrapper[4585]: I0215 17:08:20.232428 4585 generic.go:334] "Generic (PLEG): container finished" podID="4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82" containerID="abe0b99ff431d03a5738aaa093780c6f9528d908b1b22964a60f6f6553a64c6e" exitCode=0 Feb 15 17:08:20 crc kubenswrapper[4585]: I0215 17:08:20.232569 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82","Type":"ContainerDied","Data":"abe0b99ff431d03a5738aaa093780c6f9528d908b1b22964a60f6f6553a64c6e"} Feb 15 17:08:20 crc kubenswrapper[4585]: I0215 17:08:20.294097 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" podStartSLOduration=135.294075103 podStartE2EDuration="2m15.294075103s" podCreationTimestamp="2026-02-15 17:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:19.255206896 +0000 UTC m=+155.198615028" watchObservedRunningTime="2026-02-15 17:08:20.294075103 +0000 UTC m=+156.237483265" Feb 15 17:08:20 crc kubenswrapper[4585]: I0215 17:08:20.807872 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:20 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:20 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:20 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:20 crc kubenswrapper[4585]: I0215 17:08:20.807951 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:21 crc kubenswrapper[4585]: I0215 17:08:21.453653 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:21 crc kubenswrapper[4585]: I0215 17:08:21.459875 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gjvln" Feb 15 17:08:21 crc kubenswrapper[4585]: I0215 
17:08:21.756290 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:21 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:21 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:21 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:21 crc kubenswrapper[4585]: I0215 17:08:21.756541 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:21 crc kubenswrapper[4585]: I0215 17:08:21.837829 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:21 crc kubenswrapper[4585]: I0215 17:08:21.911078 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jt5m5" Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.011220 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kubelet-dir\") pod \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\" (UID: \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.011390 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kube-api-access\") pod \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\" (UID: \"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82\") " Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.011565 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82" (UID: "4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.011845 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.031770 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82" (UID: "4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.121628 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.279950 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.281055 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82","Type":"ContainerDied","Data":"d622e22e27bef4956f17209992d5571dacf9f7c4d94162569e234102001f769e"} Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.281145 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d622e22e27bef4956f17209992d5571dacf9f7c4d94162569e234102001f769e" Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.751981 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:22 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:22 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:22 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:22 crc kubenswrapper[4585]: I0215 17:08:22.752320 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.750662 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:23 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:23 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:23 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:23 
crc kubenswrapper[4585]: I0215 17:08:23.750744 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.787978 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 15 17:08:23 crc kubenswrapper[4585]: E0215 17:08:23.788217 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82" containerName="pruner" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.788228 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82" containerName="pruner" Feb 15 17:08:23 crc kubenswrapper[4585]: E0215 17:08:23.788238 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b450dc-9948-4b88-b099-3d1aebf653d3" containerName="collect-profiles" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.788245 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b450dc-9948-4b88-b099-3d1aebf653d3" containerName="collect-profiles" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.788330 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b450dc-9948-4b88-b099-3d1aebf653d3" containerName="collect-profiles" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.788340 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df3c01d-9cdf-4d82-a17e-5cd1c9e11e82" containerName="pruner" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.790890 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.794918 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.797897 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.800219 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.948975 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:23 crc kubenswrapper[4585]: I0215 17:08:23.949045 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:24 crc kubenswrapper[4585]: I0215 17:08:24.052043 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:24 crc kubenswrapper[4585]: I0215 17:08:24.052988 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:24 crc kubenswrapper[4585]: I0215 17:08:24.053075 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:24 crc kubenswrapper[4585]: I0215 17:08:24.072266 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:24 crc kubenswrapper[4585]: I0215 17:08:24.114839 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:24 crc kubenswrapper[4585]: I0215 17:08:24.751009 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:24 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:24 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:24 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:24 crc kubenswrapper[4585]: I0215 17:08:24.751325 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:25 crc kubenswrapper[4585]: I0215 17:08:25.030061 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 15 17:08:25 crc kubenswrapper[4585]: W0215 17:08:25.114903 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod600e065d_f8eb_4eee_b24a_3fcb1a8c3cfd.slice/crio-8d6ff679f0e839708ff0862b31c10831dde0f63d6ce5c46e1c2ff38c72fde200 WatchSource:0}: Error finding container 8d6ff679f0e839708ff0862b31c10831dde0f63d6ce5c46e1c2ff38c72fde200: Status 404 returned error can't find the container with id 8d6ff679f0e839708ff0862b31c10831dde0f63d6ce5c46e1c2ff38c72fde200 Feb 15 17:08:25 crc kubenswrapper[4585]: I0215 17:08:25.380084 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd","Type":"ContainerStarted","Data":"8d6ff679f0e839708ff0862b31c10831dde0f63d6ce5c46e1c2ff38c72fde200"} Feb 15 17:08:25 crc kubenswrapper[4585]: I0215 17:08:25.760064 4585 patch_prober.go:28] 
interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:25 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:25 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:25 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:25 crc kubenswrapper[4585]: I0215 17:08:25.760122 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.347206 4585 patch_prober.go:28] interesting pod/console-f9d7485db-gfx64 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.347457 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-gfx64" podUID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.487267 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfzm8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.487326 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cfzm8" podUID="149cbe83-6c8e-471a-8534-73fa827b39a6" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.487642 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-cfzm8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.487691 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cfzm8" podUID="149cbe83-6c8e-471a-8534-73fa827b39a6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.18:8080/\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.758294 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:26 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:26 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:26 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:26 crc kubenswrapper[4585]: I0215 17:08:26.758339 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:27 crc kubenswrapper[4585]: I0215 17:08:27.420004 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd","Type":"ContainerStarted","Data":"f2ce915adfb593ebabb9ac2041799edc78fbd7adaa6830fec4272ce03fe13902"} Feb 15 17:08:27 crc kubenswrapper[4585]: I0215 17:08:27.439879 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.439864618 podStartE2EDuration="4.439864618s" podCreationTimestamp="2026-02-15 17:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:27.4336375 +0000 UTC m=+163.377045632" watchObservedRunningTime="2026-02-15 17:08:27.439864618 +0000 UTC m=+163.383272750" Feb 15 17:08:27 crc kubenswrapper[4585]: I0215 17:08:27.750025 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:27 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:27 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:27 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:27 crc kubenswrapper[4585]: I0215 17:08:27.750353 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:28 crc kubenswrapper[4585]: I0215 17:08:28.758417 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:28 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:28 crc kubenswrapper[4585]: 
[+]process-running ok Feb 15 17:08:28 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:28 crc kubenswrapper[4585]: I0215 17:08:28.758459 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:28 crc kubenswrapper[4585]: I0215 17:08:28.762145 4585 generic.go:334] "Generic (PLEG): container finished" podID="600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd" containerID="f2ce915adfb593ebabb9ac2041799edc78fbd7adaa6830fec4272ce03fe13902" exitCode=0 Feb 15 17:08:28 crc kubenswrapper[4585]: I0215 17:08:28.762173 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd","Type":"ContainerDied","Data":"f2ce915adfb593ebabb9ac2041799edc78fbd7adaa6830fec4272ce03fe13902"} Feb 15 17:08:28 crc kubenswrapper[4585]: I0215 17:08:28.859782 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:08:28 crc kubenswrapper[4585]: I0215 17:08:28.869655 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2e2535-c7ad-42e7-930b-8e0471dfca11-metrics-certs\") pod \"network-metrics-daemon-gclkf\" (UID: \"ee2e2535-c7ad-42e7-930b-8e0471dfca11\") " pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:08:29 crc kubenswrapper[4585]: I0215 17:08:29.168313 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gclkf" Feb 15 17:08:29 crc kubenswrapper[4585]: I0215 17:08:29.750730 4585 patch_prober.go:28] interesting pod/router-default-5444994796-nl8js container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 15 17:08:29 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Feb 15 17:08:29 crc kubenswrapper[4585]: [+]process-running ok Feb 15 17:08:29 crc kubenswrapper[4585]: healthz check failed Feb 15 17:08:29 crc kubenswrapper[4585]: I0215 17:08:29.750807 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nl8js" podUID="a3402525-92f4-4cf0-9fee-43faccdc51bd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:08:30 crc kubenswrapper[4585]: I0215 17:08:30.749770 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:30 crc kubenswrapper[4585]: I0215 17:08:30.752930 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nl8js" Feb 15 17:08:36 crc kubenswrapper[4585]: I0215 17:08:36.394787 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:36 crc kubenswrapper[4585]: I0215 17:08:36.413329 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:08:36 crc kubenswrapper[4585]: I0215 17:08:36.500768 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-cfzm8" Feb 15 17:08:37 crc kubenswrapper[4585]: I0215 17:08:37.781017 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.569158 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.654979 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kube-api-access\") pod \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.655113 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kubelet-dir\") pod \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\" (UID: \"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd\") " Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.655172 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd" (UID: "600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.655477 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.668656 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd" (UID: "600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.756367 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.947830 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd","Type":"ContainerDied","Data":"8d6ff679f0e839708ff0862b31c10831dde0f63d6ce5c46e1c2ff38c72fde200"} Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.948137 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d6ff679f0e839708ff0862b31c10831dde0f63d6ce5c46e1c2ff38c72fde200" Feb 15 17:08:43 crc kubenswrapper[4585]: I0215 17:08:43.947877 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 15 17:08:46 crc kubenswrapper[4585]: I0215 17:08:46.890788 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-drg89" Feb 15 17:08:47 crc kubenswrapper[4585]: I0215 17:08:47.014350 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:08:47 crc kubenswrapper[4585]: I0215 17:08:47.014408 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:08:52 crc kubenswrapper[4585]: I0215 17:08:52.129651 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ncfjb"] Feb 15 17:08:52 crc kubenswrapper[4585]: E0215 17:08:52.358113 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 15 17:08:52 crc kubenswrapper[4585]: E0215 17:08:52.358414 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pb8pk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jsq6s_openshift-marketplace(b27ca682-1703-426b-b644-3f28226eda98): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 15 17:08:52 crc kubenswrapper[4585]: E0215 17:08:52.359758 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jsq6s" podUID="b27ca682-1703-426b-b644-3f28226eda98" Feb 15 17:08:52 crc 
kubenswrapper[4585]: I0215 17:08:52.876208 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.474938 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jsq6s" podUID="b27ca682-1703-426b-b644-3f28226eda98" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.588267 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.588449 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4nqdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-z6jz4_openshift-marketplace(4e461961-6ad6-4275-b1e4-dc540e024b8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.593516 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-z6jz4" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" Feb 15 17:08:54 crc 
kubenswrapper[4585]: E0215 17:08:54.655075 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.655396 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vbvz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-4c48n_openshift-marketplace(329f56ce-8e35-4eec-adaf-123808e4af4e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.659962 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-4c48n" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.698947 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.699239 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gwvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8nshz_openshift-marketplace(76a47487-6876-4c12-9b15-d8594ad9d748): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 15 17:08:54 crc kubenswrapper[4585]: E0215 17:08:54.700416 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8nshz" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" Feb 15 17:08:55 crc 
kubenswrapper[4585]: I0215 17:08:55.017382 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gclkf"] Feb 15 17:08:55 crc kubenswrapper[4585]: I0215 17:08:55.020958 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x5fb" event={"ID":"9d0804c7-219d-4c0c-ae97-f7e8ebac6895","Type":"ContainerStarted","Data":"4c51804d57c5870f51221a309a22d422b26a4f20e88af1db3f4eddc7ba282a2f"} Feb 15 17:08:55 crc kubenswrapper[4585]: I0215 17:08:55.026664 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vk82" event={"ID":"d3beabcf-c3ca-49e7-a5e5-5719f184fab7","Type":"ContainerStarted","Data":"7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1"} Feb 15 17:08:55 crc kubenswrapper[4585]: I0215 17:08:55.046940 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlvv9" event={"ID":"62ebdb47-d46a-431d-a8ce-b993c2e561dd","Type":"ContainerStarted","Data":"15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17"} Feb 15 17:08:55 crc kubenswrapper[4585]: I0215 17:08:55.050797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbsth" event={"ID":"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4","Type":"ContainerStarted","Data":"96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089"} Feb 15 17:08:55 crc kubenswrapper[4585]: E0215 17:08:55.052477 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-z6jz4" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" Feb 15 17:08:55 crc kubenswrapper[4585]: E0215 17:08:55.052695 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-4c48n" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" Feb 15 17:08:55 crc kubenswrapper[4585]: E0215 17:08:55.052713 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8nshz" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.057736 4585 generic.go:334] "Generic (PLEG): container finished" podID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerID="15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17" exitCode=0 Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.058011 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlvv9" event={"ID":"62ebdb47-d46a-431d-a8ce-b993c2e561dd","Type":"ContainerDied","Data":"15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17"} Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.061474 4585 generic.go:334] "Generic (PLEG): container finished" podID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerID="96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089" exitCode=0 Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.061548 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbsth" event={"ID":"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4","Type":"ContainerDied","Data":"96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089"} Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.064987 4585 generic.go:334] "Generic (PLEG): container finished" podID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" 
containerID="4c51804d57c5870f51221a309a22d422b26a4f20e88af1db3f4eddc7ba282a2f" exitCode=0 Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.065051 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x5fb" event={"ID":"9d0804c7-219d-4c0c-ae97-f7e8ebac6895","Type":"ContainerDied","Data":"4c51804d57c5870f51221a309a22d422b26a4f20e88af1db3f4eddc7ba282a2f"} Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.078287 4585 generic.go:334] "Generic (PLEG): container finished" podID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerID="7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1" exitCode=0 Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.078348 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vk82" event={"ID":"d3beabcf-c3ca-49e7-a5e5-5719f184fab7","Type":"ContainerDied","Data":"7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1"} Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.085704 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gclkf" event={"ID":"ee2e2535-c7ad-42e7-930b-8e0471dfca11","Type":"ContainerStarted","Data":"422dd1c441b05572bd3bf754bf1ed389c05d4f5b9aed40f3e057156e63e42141"} Feb 15 17:08:56 crc kubenswrapper[4585]: I0215 17:08:56.085733 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gclkf" event={"ID":"ee2e2535-c7ad-42e7-930b-8e0471dfca11","Type":"ContainerStarted","Data":"2937af630b819d6779c5b9bf0a932acc37191080029fe5a00b990c6a6a43b88a"} Feb 15 17:08:57 crc kubenswrapper[4585]: I0215 17:08:57.092836 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gclkf" event={"ID":"ee2e2535-c7ad-42e7-930b-8e0471dfca11","Type":"ContainerStarted","Data":"ed8d79bd7de657a15035cdc261dff7ee8debfcb4bafdc127d2cf6f39d53ec647"} Feb 15 17:08:57 crc kubenswrapper[4585]: I0215 
17:08:57.116936 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gclkf" podStartSLOduration=173.116912614 podStartE2EDuration="2m53.116912614s" podCreationTimestamp="2026-02-15 17:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:08:57.112982194 +0000 UTC m=+193.056390326" watchObservedRunningTime="2026-02-15 17:08:57.116912614 +0000 UTC m=+193.060320746" Feb 15 17:08:58 crc kubenswrapper[4585]: I0215 17:08:58.101260 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlvv9" event={"ID":"62ebdb47-d46a-431d-a8ce-b993c2e561dd","Type":"ContainerStarted","Data":"900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c"} Feb 15 17:08:58 crc kubenswrapper[4585]: I0215 17:08:58.103992 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbsth" event={"ID":"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4","Type":"ContainerStarted","Data":"ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00"} Feb 15 17:08:58 crc kubenswrapper[4585]: I0215 17:08:58.109143 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x5fb" event={"ID":"9d0804c7-219d-4c0c-ae97-f7e8ebac6895","Type":"ContainerStarted","Data":"09e049094260eaeab6acb7a30dc8b52d299466cafe81865106062dcf3fc06da9"} Feb 15 17:08:58 crc kubenswrapper[4585]: I0215 17:08:58.113054 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vk82" event={"ID":"d3beabcf-c3ca-49e7-a5e5-5719f184fab7","Type":"ContainerStarted","Data":"b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a"} Feb 15 17:08:58 crc kubenswrapper[4585]: I0215 17:08:58.132259 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-wlvv9" podStartSLOduration=4.390291567 podStartE2EDuration="45.132239562s" podCreationTimestamp="2026-02-15 17:08:13 +0000 UTC" firstStartedPulling="2026-02-15 17:08:17.029186508 +0000 UTC m=+152.972594640" lastFinishedPulling="2026-02-15 17:08:57.771134503 +0000 UTC m=+193.714542635" observedRunningTime="2026-02-15 17:08:58.129346999 +0000 UTC m=+194.072755131" watchObservedRunningTime="2026-02-15 17:08:58.132239562 +0000 UTC m=+194.075647694" Feb 15 17:08:58 crc kubenswrapper[4585]: I0215 17:08:58.146890 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sbsth" podStartSLOduration=3.091426715 podStartE2EDuration="45.146867845s" podCreationTimestamp="2026-02-15 17:08:13 +0000 UTC" firstStartedPulling="2026-02-15 17:08:15.785339755 +0000 UTC m=+151.728747877" lastFinishedPulling="2026-02-15 17:08:57.840780875 +0000 UTC m=+193.784189007" observedRunningTime="2026-02-15 17:08:58.144280419 +0000 UTC m=+194.087688551" watchObservedRunningTime="2026-02-15 17:08:58.146867845 +0000 UTC m=+194.090275977" Feb 15 17:08:58 crc kubenswrapper[4585]: I0215 17:08:58.165684 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6vk82" podStartSLOduration=3.785352688 podStartE2EDuration="43.165668793s" podCreationTimestamp="2026-02-15 17:08:15 +0000 UTC" firstStartedPulling="2026-02-15 17:08:18.127768034 +0000 UTC m=+154.071176166" lastFinishedPulling="2026-02-15 17:08:57.508084099 +0000 UTC m=+193.451492271" observedRunningTime="2026-02-15 17:08:58.164059832 +0000 UTC m=+194.107467964" watchObservedRunningTime="2026-02-15 17:08:58.165668793 +0000 UTC m=+194.109076925" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.376763 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4x5fb" podStartSLOduration=4.798606363 
podStartE2EDuration="44.37673777s" podCreationTimestamp="2026-02-15 17:08:16 +0000 UTC" firstStartedPulling="2026-02-15 17:08:18.124389769 +0000 UTC m=+154.067797901" lastFinishedPulling="2026-02-15 17:08:57.702521176 +0000 UTC m=+193.645929308" observedRunningTime="2026-02-15 17:08:58.184770989 +0000 UTC m=+194.128179121" watchObservedRunningTime="2026-02-15 17:09:00.37673777 +0000 UTC m=+196.320145902" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.378088 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 15 17:09:00 crc kubenswrapper[4585]: E0215 17:09:00.378355 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd" containerName="pruner" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.378376 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd" containerName="pruner" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.378499 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="600e065d-f8eb-4eee-b24a-3fcb1a8c3cfd" containerName="pruner" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.379053 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.380938 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.383185 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.403134 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.483526 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.483639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.585075 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.585141 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.585218 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.617385 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:00 crc kubenswrapper[4585]: I0215 17:09:00.695325 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:01 crc kubenswrapper[4585]: I0215 17:09:01.128863 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 15 17:09:01 crc kubenswrapper[4585]: W0215 17:09:01.136460 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc0d0c673_192a_4b38_a0c7_e7ebe75839db.slice/crio-dffae7f7f0b510658ee4b71540e60c63a564fb77fd58665970d866432c23b6c0 WatchSource:0}: Error finding container dffae7f7f0b510658ee4b71540e60c63a564fb77fd58665970d866432c23b6c0: Status 404 returned error can't find the container with id dffae7f7f0b510658ee4b71540e60c63a564fb77fd58665970d866432c23b6c0 Feb 15 17:09:02 crc kubenswrapper[4585]: I0215 17:09:02.141947 4585 generic.go:334] "Generic (PLEG): container finished" podID="c0d0c673-192a-4b38-a0c7-e7ebe75839db" containerID="441a53784d651c2ec8d42dc1b1c1474e31f714a17945136cc1b1bcd87208133f" exitCode=0 Feb 15 17:09:02 crc kubenswrapper[4585]: I0215 17:09:02.142063 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c0d0c673-192a-4b38-a0c7-e7ebe75839db","Type":"ContainerDied","Data":"441a53784d651c2ec8d42dc1b1c1474e31f714a17945136cc1b1bcd87208133f"} Feb 15 17:09:02 crc kubenswrapper[4585]: I0215 17:09:02.142298 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c0d0c673-192a-4b38-a0c7-e7ebe75839db","Type":"ContainerStarted","Data":"dffae7f7f0b510658ee4b71540e60c63a564fb77fd58665970d866432c23b6c0"} Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.438192 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.624572 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kubelet-dir\") pod \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.624679 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c0d0c673-192a-4b38-a0c7-e7ebe75839db" (UID: "c0d0c673-192a-4b38-a0c7-e7ebe75839db"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.624911 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kube-api-access\") pod \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\" (UID: \"c0d0c673-192a-4b38-a0c7-e7ebe75839db\") " Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.625440 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.630977 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c0d0c673-192a-4b38-a0c7-e7ebe75839db" (UID: "c0d0c673-192a-4b38-a0c7-e7ebe75839db"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.726499 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d0c673-192a-4b38-a0c7-e7ebe75839db-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.834304 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:09:03 crc kubenswrapper[4585]: I0215 17:09:03.834410 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:09:04 crc kubenswrapper[4585]: I0215 17:09:04.155774 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 15 17:09:04 crc kubenswrapper[4585]: I0215 17:09:04.155796 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c0d0c673-192a-4b38-a0c7-e7ebe75839db","Type":"ContainerDied","Data":"dffae7f7f0b510658ee4b71540e60c63a564fb77fd58665970d866432c23b6c0"} Feb 15 17:09:04 crc kubenswrapper[4585]: I0215 17:09:04.155845 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dffae7f7f0b510658ee4b71540e60c63a564fb77fd58665970d866432c23b6c0" Feb 15 17:09:04 crc kubenswrapper[4585]: I0215 17:09:04.274330 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:09:04 crc kubenswrapper[4585]: I0215 17:09:04.274402 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:09:04 crc kubenswrapper[4585]: I0215 17:09:04.348806 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:09:04 crc kubenswrapper[4585]: I0215 17:09:04.348905 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:09:05 crc kubenswrapper[4585]: I0215 17:09:05.213986 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:09:05 crc kubenswrapper[4585]: I0215 17:09:05.219472 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:09:05 crc kubenswrapper[4585]: I0215 17:09:05.582316 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlvv9"] Feb 15 17:09:06 crc kubenswrapper[4585]: I0215 17:09:06.108280 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:09:06 crc kubenswrapper[4585]: I0215 17:09:06.109574 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:09:06 crc kubenswrapper[4585]: I0215 17:09:06.147456 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:09:06 crc kubenswrapper[4585]: I0215 17:09:06.211888 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:09:06 crc kubenswrapper[4585]: I0215 17:09:06.427224 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:09:06 crc kubenswrapper[4585]: I0215 17:09:06.427284 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:09:06 crc kubenswrapper[4585]: I0215 17:09:06.490838 
4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.174254 4585 generic.go:334] "Generic (PLEG): container finished" podID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerID="cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822" exitCode=0 Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.174310 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c48n" event={"ID":"329f56ce-8e35-4eec-adaf-123808e4af4e","Type":"ContainerDied","Data":"cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822"} Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.175369 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wlvv9" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="registry-server" containerID="cri-o://900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c" gracePeriod=2 Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.243125 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.564933 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.688513 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-utilities\") pod \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.688562 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvnm2\" (UniqueName: \"kubernetes.io/projected/62ebdb47-d46a-431d-a8ce-b993c2e561dd-kube-api-access-dvnm2\") pod \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.688618 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-catalog-content\") pod \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\" (UID: \"62ebdb47-d46a-431d-a8ce-b993c2e561dd\") " Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.689206 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-utilities" (OuterVolumeSpecName: "utilities") pod "62ebdb47-d46a-431d-a8ce-b993c2e561dd" (UID: "62ebdb47-d46a-431d-a8ce-b993c2e561dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.700402 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ebdb47-d46a-431d-a8ce-b993c2e561dd-kube-api-access-dvnm2" (OuterVolumeSpecName: "kube-api-access-dvnm2") pod "62ebdb47-d46a-431d-a8ce-b993c2e561dd" (UID: "62ebdb47-d46a-431d-a8ce-b993c2e561dd"). InnerVolumeSpecName "kube-api-access-dvnm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.752316 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62ebdb47-d46a-431d-a8ce-b993c2e561dd" (UID: "62ebdb47-d46a-431d-a8ce-b993c2e561dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.789685 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.789934 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvnm2\" (UniqueName: \"kubernetes.io/projected/62ebdb47-d46a-431d-a8ce-b993c2e561dd-kube-api-access-dvnm2\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:07 crc kubenswrapper[4585]: I0215 17:09:07.789948 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62ebdb47-d46a-431d-a8ce-b993c2e561dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.189494 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nshz" event={"ID":"76a47487-6876-4c12-9b15-d8594ad9d748","Type":"ContainerStarted","Data":"f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e"} Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.195447 4585 generic.go:334] "Generic (PLEG): container finished" podID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerID="900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c" exitCode=0 Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.195506 4585 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-wlvv9" event={"ID":"62ebdb47-d46a-431d-a8ce-b993c2e561dd","Type":"ContainerDied","Data":"900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c"} Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.195529 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlvv9" event={"ID":"62ebdb47-d46a-431d-a8ce-b993c2e561dd","Type":"ContainerDied","Data":"d44fcb7e52e8f11311184be534a6e2e11802052c34b22fd1f28d1d1e1c858f79"} Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.195548 4585 scope.go:117] "RemoveContainer" containerID="900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.195667 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlvv9" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.204895 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c48n" event={"ID":"329f56ce-8e35-4eec-adaf-123808e4af4e","Type":"ContainerStarted","Data":"8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f"} Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.220638 4585 scope.go:117] "RemoveContainer" containerID="15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.241469 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlvv9"] Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.243988 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wlvv9"] Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.245206 4585 scope.go:117] "RemoveContainer" containerID="f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 
17:09:08.263997 4585 scope.go:117] "RemoveContainer" containerID="900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c" Feb 15 17:09:08 crc kubenswrapper[4585]: E0215 17:09:08.264344 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c\": container with ID starting with 900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c not found: ID does not exist" containerID="900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.264371 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c"} err="failed to get container status \"900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c\": rpc error: code = NotFound desc = could not find container \"900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c\": container with ID starting with 900efe776286edd496b79257683444354b497b5145fef325ff9455745ccf6f1c not found: ID does not exist" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.264412 4585 scope.go:117] "RemoveContainer" containerID="15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17" Feb 15 17:09:08 crc kubenswrapper[4585]: E0215 17:09:08.264581 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17\": container with ID starting with 15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17 not found: ID does not exist" containerID="15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.264642 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17"} err="failed to get container status \"15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17\": rpc error: code = NotFound desc = could not find container \"15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17\": container with ID starting with 15658dbf510760ad51d9ad6a0dc8bf0b940f771e28af67799ad35bef41a5bc17 not found: ID does not exist" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.264657 4585 scope.go:117] "RemoveContainer" containerID="f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.264826 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4c48n" podStartSLOduration=4.69811234 podStartE2EDuration="55.264807522s" podCreationTimestamp="2026-02-15 17:08:13 +0000 UTC" firstStartedPulling="2026-02-15 17:08:17.028779118 +0000 UTC m=+152.972187250" lastFinishedPulling="2026-02-15 17:09:07.5954743 +0000 UTC m=+203.538882432" observedRunningTime="2026-02-15 17:09:08.254810237 +0000 UTC m=+204.198218369" watchObservedRunningTime="2026-02-15 17:09:08.264807522 +0000 UTC m=+204.208215654" Feb 15 17:09:08 crc kubenswrapper[4585]: E0215 17:09:08.265105 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272\": container with ID starting with f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272 not found: ID does not exist" containerID="f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.265144 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272"} err="failed to get container 
status \"f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272\": rpc error: code = NotFound desc = could not find container \"f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272\": container with ID starting with f3688ff22424d64017389bdad404b9056d6e7572538d289e13a5e9101007b272 not found: ID does not exist" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.772516 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 15 17:09:08 crc kubenswrapper[4585]: E0215 17:09:08.772719 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="registry-server" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.772731 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="registry-server" Feb 15 17:09:08 crc kubenswrapper[4585]: E0215 17:09:08.772739 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="extract-content" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.772745 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="extract-content" Feb 15 17:09:08 crc kubenswrapper[4585]: E0215 17:09:08.772754 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d0c673-192a-4b38-a0c7-e7ebe75839db" containerName="pruner" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.772760 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d0c673-192a-4b38-a0c7-e7ebe75839db" containerName="pruner" Feb 15 17:09:08 crc kubenswrapper[4585]: E0215 17:09:08.772774 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="extract-utilities" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.772780 4585 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="extract-utilities" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.772868 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d0c673-192a-4b38-a0c7-e7ebe75839db" containerName="pruner" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.772878 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" containerName="registry-server" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.773192 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.776237 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.776424 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.802154 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.818720 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-var-lock\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.818926 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fd11a6-03b9-427b-8678-ba01bad122cd-kube-api-access\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc 
kubenswrapper[4585]: I0215 17:09:08.824116 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.850188 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ebdb47-d46a-431d-a8ce-b993c2e561dd" path="/var/lib/kubelet/pods/62ebdb47-d46a-431d-a8ce-b993c2e561dd/volumes" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.926204 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.926308 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-var-lock\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.926343 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-kubelet-dir\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.926383 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fd11a6-03b9-427b-8678-ba01bad122cd-kube-api-access\") pod 
\"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.926417 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-var-lock\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:08 crc kubenswrapper[4585]: I0215 17:09:08.944676 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fd11a6-03b9-427b-8678-ba01bad122cd-kube-api-access\") pod \"installer-9-crc\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.087518 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.226852 4585 generic.go:334] "Generic (PLEG): container finished" podID="b27ca682-1703-426b-b644-3f28226eda98" containerID="05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3" exitCode=0 Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.227026 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsq6s" event={"ID":"b27ca682-1703-426b-b644-3f28226eda98","Type":"ContainerDied","Data":"05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3"} Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.238503 4585 generic.go:334] "Generic (PLEG): container finished" podID="76a47487-6876-4c12-9b15-d8594ad9d748" containerID="f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e" exitCode=0 Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.238552 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8nshz" event={"ID":"76a47487-6876-4c12-9b15-d8594ad9d748","Type":"ContainerDied","Data":"f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e"} Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.347163 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.979809 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x5fb"] Feb 15 17:09:09 crc kubenswrapper[4585]: I0215 17:09:09.980392 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4x5fb" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="registry-server" containerID="cri-o://09e049094260eaeab6acb7a30dc8b52d299466cafe81865106062dcf3fc06da9" gracePeriod=2 Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.254829 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nshz" event={"ID":"76a47487-6876-4c12-9b15-d8594ad9d748","Type":"ContainerStarted","Data":"eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb"} Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.282613 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8nshz" podStartSLOduration=2.738172722 podStartE2EDuration="54.282585124s" podCreationTimestamp="2026-02-15 17:08:16 +0000 UTC" firstStartedPulling="2026-02-15 17:08:18.133559352 +0000 UTC m=+154.076967484" lastFinishedPulling="2026-02-15 17:09:09.677971744 +0000 UTC m=+205.621379886" observedRunningTime="2026-02-15 17:09:10.280701633 +0000 UTC m=+206.224109765" watchObservedRunningTime="2026-02-15 17:09:10.282585124 +0000 UTC m=+206.225993256" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.285175 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerID="09e049094260eaeab6acb7a30dc8b52d299466cafe81865106062dcf3fc06da9" exitCode=0 Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.285231 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x5fb" event={"ID":"9d0804c7-219d-4c0c-ae97-f7e8ebac6895","Type":"ContainerDied","Data":"09e049094260eaeab6acb7a30dc8b52d299466cafe81865106062dcf3fc06da9"} Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.291393 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsq6s" event={"ID":"b27ca682-1703-426b-b644-3f28226eda98","Type":"ContainerStarted","Data":"f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56"} Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.295378 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"83fd11a6-03b9-427b-8678-ba01bad122cd","Type":"ContainerStarted","Data":"c9c6bbfe5bc5735e9f1d830c37c41316e595d052e7d6d095743a537dd82da747"} Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.295536 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"83fd11a6-03b9-427b-8678-ba01bad122cd","Type":"ContainerStarted","Data":"1d76e22afc68ff9d5824c8579369ffe23abe61d159c1734dea3705b1c83acedd"} Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.318227 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jsq6s" podStartSLOduration=2.838964572 podStartE2EDuration="53.318210406s" podCreationTimestamp="2026-02-15 17:08:17 +0000 UTC" firstStartedPulling="2026-02-15 17:08:19.156786511 +0000 UTC m=+155.100194643" lastFinishedPulling="2026-02-15 17:09:09.636032345 +0000 UTC m=+205.579440477" observedRunningTime="2026-02-15 17:09:10.31764312 +0000 UTC m=+206.261051252" watchObservedRunningTime="2026-02-15 
17:09:10.318210406 +0000 UTC m=+206.261618538" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.342347 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.342315593 podStartE2EDuration="2.342315593s" podCreationTimestamp="2026-02-15 17:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:09:10.333392907 +0000 UTC m=+206.276801049" watchObservedRunningTime="2026-02-15 17:09:10.342315593 +0000 UTC m=+206.285723725" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.366228 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.450630 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-utilities\") pod \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.450795 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-catalog-content\") pod \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.450939 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6cdl\" (UniqueName: \"kubernetes.io/projected/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-kube-api-access-d6cdl\") pod \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\" (UID: \"9d0804c7-219d-4c0c-ae97-f7e8ebac6895\") " Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.452067 4585 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-utilities" (OuterVolumeSpecName: "utilities") pod "9d0804c7-219d-4c0c-ae97-f7e8ebac6895" (UID: "9d0804c7-219d-4c0c-ae97-f7e8ebac6895"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.458021 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-kube-api-access-d6cdl" (OuterVolumeSpecName: "kube-api-access-d6cdl") pod "9d0804c7-219d-4c0c-ae97-f7e8ebac6895" (UID: "9d0804c7-219d-4c0c-ae97-f7e8ebac6895"). InnerVolumeSpecName "kube-api-access-d6cdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.476834 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d0804c7-219d-4c0c-ae97-f7e8ebac6895" (UID: "9d0804c7-219d-4c0c-ae97-f7e8ebac6895"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.553165 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.553450 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6cdl\" (UniqueName: \"kubernetes.io/projected/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-kube-api-access-d6cdl\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:10 crc kubenswrapper[4585]: I0215 17:09:10.553542 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0804c7-219d-4c0c-ae97-f7e8ebac6895-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.304454 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x5fb" event={"ID":"9d0804c7-219d-4c0c-ae97-f7e8ebac6895","Type":"ContainerDied","Data":"53f3a29a3b875d46e6594ffc5cedce52903e2a1c8c8028b49900eb6e934952ea"} Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.305014 4585 scope.go:117] "RemoveContainer" containerID="09e049094260eaeab6acb7a30dc8b52d299466cafe81865106062dcf3fc06da9" Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.304476 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x5fb" Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.307550 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerID="81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b" exitCode=0 Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.307656 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6jz4" event={"ID":"4e461961-6ad6-4275-b1e4-dc540e024b8a","Type":"ContainerDied","Data":"81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b"} Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.322816 4585 scope.go:117] "RemoveContainer" containerID="4c51804d57c5870f51221a309a22d422b26a4f20e88af1db3f4eddc7ba282a2f" Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.351989 4585 scope.go:117] "RemoveContainer" containerID="d7d3319c4e54810987b9dab93efa93d67dc5f53c5e9f4492a39b4f603aa52597" Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.378369 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x5fb"] Feb 15 17:09:11 crc kubenswrapper[4585]: I0215 17:09:11.394606 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x5fb"] Feb 15 17:09:12 crc kubenswrapper[4585]: I0215 17:09:12.314768 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6jz4" event={"ID":"4e461961-6ad6-4275-b1e4-dc540e024b8a","Type":"ContainerStarted","Data":"690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b"} Feb 15 17:09:12 crc kubenswrapper[4585]: I0215 17:09:12.336811 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z6jz4" podStartSLOduration=3.481245654 podStartE2EDuration="58.336798279s" podCreationTimestamp="2026-02-15 17:08:14 +0000 UTC" 
firstStartedPulling="2026-02-15 17:08:16.878127074 +0000 UTC m=+152.821535206" lastFinishedPulling="2026-02-15 17:09:11.733679699 +0000 UTC m=+207.677087831" observedRunningTime="2026-02-15 17:09:12.335185257 +0000 UTC m=+208.278593399" watchObservedRunningTime="2026-02-15 17:09:12.336798279 +0000 UTC m=+208.280206411" Feb 15 17:09:12 crc kubenswrapper[4585]: I0215 17:09:12.849102 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" path="/var/lib/kubelet/pods/9d0804c7-219d-4c0c-ae97-f7e8ebac6895/volumes" Feb 15 17:09:14 crc kubenswrapper[4585]: I0215 17:09:14.026989 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:09:14 crc kubenswrapper[4585]: I0215 17:09:14.028087 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:09:14 crc kubenswrapper[4585]: I0215 17:09:14.068569 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:09:14 crc kubenswrapper[4585]: I0215 17:09:14.373148 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:09:14 crc kubenswrapper[4585]: I0215 17:09:14.442109 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:09:14 crc kubenswrapper[4585]: I0215 17:09:14.442160 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:09:14 crc kubenswrapper[4585]: I0215 17:09:14.488233 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.006738 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.007156 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.014900 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.014956 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.015000 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.015397 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.015466 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" 
containerID="cri-o://01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4" gracePeriod=600 Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.072560 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.174182 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" podUID="f1907e91-47d3-4f5f-b701-bcd299d3b95b" containerName="oauth-openshift" containerID="cri-o://f760f5a2efa454041b44bed7a9fdcbb2f99f6a2ab573bb209805e0ad158d6adb" gracePeriod=15 Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.350260 4585 generic.go:334] "Generic (PLEG): container finished" podID="f1907e91-47d3-4f5f-b701-bcd299d3b95b" containerID="f760f5a2efa454041b44bed7a9fdcbb2f99f6a2ab573bb209805e0ad158d6adb" exitCode=0 Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.350314 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" event={"ID":"f1907e91-47d3-4f5f-b701-bcd299d3b95b","Type":"ContainerDied","Data":"f760f5a2efa454041b44bed7a9fdcbb2f99f6a2ab573bb209805e0ad158d6adb"} Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.354664 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4" exitCode=0 Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.354739 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4"} Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.437277 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.437344 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.450980 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.501227 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.622077 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.667936 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-trusted-ca-bundle\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.667989 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-idp-0-file-data\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668012 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-error\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 
15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668051 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-provider-selection\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668099 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-service-ca\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668128 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-router-certs\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668150 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-cliconfig\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668195 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp2v4\" (UniqueName: \"kubernetes.io/projected/f1907e91-47d3-4f5f-b701-bcd299d3b95b-kube-api-access-qp2v4\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668225 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-login\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668257 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-serving-cert\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668300 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-ocp-branding-template\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668334 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-session\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668371 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-dir\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.668405 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-policies\") pod \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\" (UID: \"f1907e91-47d3-4f5f-b701-bcd299d3b95b\") " Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.669038 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.669138 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.669798 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.670880 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.675838 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.677793 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1907e91-47d3-4f5f-b701-bcd299d3b95b-kube-api-access-qp2v4" (OuterVolumeSpecName: "kube-api-access-qp2v4") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "kube-api-access-qp2v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.677839 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.677948 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.678169 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.678394 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.678674 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.679099 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.684060 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.684284 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f1907e91-47d3-4f5f-b701-bcd299d3b95b" (UID: "f1907e91-47d3-4f5f-b701-bcd299d3b95b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770484 4585 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770533 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770550 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770565 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770579 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770606 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770618 4585 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770633 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770651 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp2v4\" (UniqueName: \"kubernetes.io/projected/f1907e91-47d3-4f5f-b701-bcd299d3b95b-kube-api-access-qp2v4\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770662 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770673 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770685 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: I0215 17:09:17.770698 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1907e91-47d3-4f5f-b701-bcd299d3b95b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:17 crc kubenswrapper[4585]: 
I0215 17:09:17.770709 4585 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1907e91-47d3-4f5f-b701-bcd299d3b95b-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.366780 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"a387d5dc4956e239f38a2c3f2fecb007c2e9827c50dc04627a92710ff93e588d"} Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.369000 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.369225 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ncfjb" event={"ID":"f1907e91-47d3-4f5f-b701-bcd299d3b95b","Type":"ContainerDied","Data":"2c7c6ae4eebd0cec2675bd64fadc66efcf12e5742fa0504e760f0e320abed512"} Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.369477 4585 scope.go:117] "RemoveContainer" containerID="f760f5a2efa454041b44bed7a9fdcbb2f99f6a2ab573bb209805e0ad158d6adb" Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.443011 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ncfjb"] Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.451611 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ncfjb"] Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.504190 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:09:18 crc kubenswrapper[4585]: I0215 17:09:18.853269 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1907e91-47d3-4f5f-b701-bcd299d3b95b" 
path="/var/lib/kubelet/pods/f1907e91-47d3-4f5f-b701-bcd299d3b95b/volumes" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.185177 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jsq6s"] Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.828476 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-786b6d57dd-x4vv5"] Feb 15 17:09:19 crc kubenswrapper[4585]: E0215 17:09:19.828890 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="extract-content" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.828916 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="extract-content" Feb 15 17:09:19 crc kubenswrapper[4585]: E0215 17:09:19.828936 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="registry-server" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.828952 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="registry-server" Feb 15 17:09:19 crc kubenswrapper[4585]: E0215 17:09:19.828981 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="extract-utilities" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.828998 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="extract-utilities" Feb 15 17:09:19 crc kubenswrapper[4585]: E0215 17:09:19.829033 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1907e91-47d3-4f5f-b701-bcd299d3b95b" containerName="oauth-openshift" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.829045 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1907e91-47d3-4f5f-b701-bcd299d3b95b" 
containerName="oauth-openshift" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.829250 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1907e91-47d3-4f5f-b701-bcd299d3b95b" containerName="oauth-openshift" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.829273 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0804c7-219d-4c0c-ae97-f7e8ebac6895" containerName="registry-server" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.829971 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.834266 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.835846 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.835868 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.837748 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.837776 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.837933 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.838121 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 
17:09:19.850925 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.851178 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.851343 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.851585 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.860071 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.861323 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.864214 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.870536 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-786b6d57dd-x4vv5"] Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.882261 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905133 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-error\") pod 
\"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905223 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905276 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-login\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905318 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905389 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " 
pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905447 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905475 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-audit-policies\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905503 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqk9\" (UniqueName: \"kubernetes.io/projected/dd5a5523-565c-421f-aa56-87d396f6e0e4-kube-api-access-nhqk9\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905529 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-session\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905571 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905669 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905693 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.905724 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:19 crc kubenswrapper[4585]: I0215 17:09:19.907347 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/dd5a5523-565c-421f-aa56-87d396f6e0e4-audit-dir\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008504 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqk9\" (UniqueName: \"kubernetes.io/projected/dd5a5523-565c-421f-aa56-87d396f6e0e4-kube-api-access-nhqk9\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008558 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-session\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008614 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008660 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc 
kubenswrapper[4585]: I0215 17:09:20.008686 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008714 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008746 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dd5a5523-565c-421f-aa56-87d396f6e0e4-audit-dir\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008785 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-error\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008807 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008830 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-login\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008895 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008923 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: 
\"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.008946 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-audit-policies\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.009726 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-audit-policies\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.009750 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.010868 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.010891 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/dd5a5523-565c-421f-aa56-87d396f6e0e4-audit-dir\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.011722 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.016118 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-error\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.016379 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-session\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.017026 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-login\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: 
I0215 17:09:20.018808 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.020040 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.023095 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.023372 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.024931 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dd5a5523-565c-421f-aa56-87d396f6e0e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.040752 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqk9\" (UniqueName: \"kubernetes.io/projected/dd5a5523-565c-421f-aa56-87d396f6e0e4-kube-api-access-nhqk9\") pod \"oauth-openshift-786b6d57dd-x4vv5\" (UID: \"dd5a5523-565c-421f-aa56-87d396f6e0e4\") " pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.161645 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.351606 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-786b6d57dd-x4vv5"] Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.383648 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" event={"ID":"dd5a5523-565c-421f-aa56-87d396f6e0e4","Type":"ContainerStarted","Data":"6d8782144230d0b7021277543839ac35d80e93da0eef08ca4ff7b6c949ff3b5f"} Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.383839 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jsq6s" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="registry-server" containerID="cri-o://f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56" gracePeriod=2 Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.736891 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.818125 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-utilities\") pod \"b27ca682-1703-426b-b644-3f28226eda98\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.818270 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-catalog-content\") pod \"b27ca682-1703-426b-b644-3f28226eda98\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.818381 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8pk\" (UniqueName: \"kubernetes.io/projected/b27ca682-1703-426b-b644-3f28226eda98-kube-api-access-pb8pk\") pod \"b27ca682-1703-426b-b644-3f28226eda98\" (UID: \"b27ca682-1703-426b-b644-3f28226eda98\") " Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.819891 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-utilities" (OuterVolumeSpecName: "utilities") pod "b27ca682-1703-426b-b644-3f28226eda98" (UID: "b27ca682-1703-426b-b644-3f28226eda98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.823815 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27ca682-1703-426b-b644-3f28226eda98-kube-api-access-pb8pk" (OuterVolumeSpecName: "kube-api-access-pb8pk") pod "b27ca682-1703-426b-b644-3f28226eda98" (UID: "b27ca682-1703-426b-b644-3f28226eda98"). InnerVolumeSpecName "kube-api-access-pb8pk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.920093 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb8pk\" (UniqueName: \"kubernetes.io/projected/b27ca682-1703-426b-b644-3f28226eda98-kube-api-access-pb8pk\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.920121 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:20 crc kubenswrapper[4585]: I0215 17:09:20.958764 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b27ca682-1703-426b-b644-3f28226eda98" (UID: "b27ca682-1703-426b-b644-3f28226eda98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.022199 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b27ca682-1703-426b-b644-3f28226eda98-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.394982 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" event={"ID":"dd5a5523-565c-421f-aa56-87d396f6e0e4","Type":"ContainerStarted","Data":"f94a3c4c1cace8d9bc72ef5adb405db4125bc8e1f472d97f300fe0b9ddb160a0"} Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.395804 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.403710 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="b27ca682-1703-426b-b644-3f28226eda98" containerID="f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56" exitCode=0 Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.403788 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsq6s" event={"ID":"b27ca682-1703-426b-b644-3f28226eda98","Type":"ContainerDied","Data":"f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56"} Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.403834 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsq6s" event={"ID":"b27ca682-1703-426b-b644-3f28226eda98","Type":"ContainerDied","Data":"b263e182aa1c8e85c4cd5bbb5fbe190e2a94661057126ad9d10a4f61e0106815"} Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.403863 4585 scope.go:117] "RemoveContainer" containerID="f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.404854 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jsq6s" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.409775 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.426360 4585 scope.go:117] "RemoveContainer" containerID="05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.443879 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-786b6d57dd-x4vv5" podStartSLOduration=29.443853893 podStartE2EDuration="29.443853893s" podCreationTimestamp="2026-02-15 17:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:09:21.441544922 +0000 UTC m=+217.384953094" watchObservedRunningTime="2026-02-15 17:09:21.443853893 +0000 UTC m=+217.387262035" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.485152 4585 scope.go:117] "RemoveContainer" containerID="ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.523762 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jsq6s"] Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.523878 4585 scope.go:117] "RemoveContainer" containerID="f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56" Feb 15 17:09:21 crc kubenswrapper[4585]: E0215 17:09:21.524309 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56\": container with ID starting with f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56 not found: ID does not exist" 
containerID="f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.524349 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56"} err="failed to get container status \"f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56\": rpc error: code = NotFound desc = could not find container \"f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56\": container with ID starting with f838ad45d768bb30803669ad5073f7f028965061d16928ac8435b16f0a699d56 not found: ID does not exist" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.524397 4585 scope.go:117] "RemoveContainer" containerID="05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3" Feb 15 17:09:21 crc kubenswrapper[4585]: E0215 17:09:21.524682 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3\": container with ID starting with 05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3 not found: ID does not exist" containerID="05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.524733 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3"} err="failed to get container status \"05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3\": rpc error: code = NotFound desc = could not find container \"05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3\": container with ID starting with 05c29a8e5dab8c6558e294a73e921851c00f6e9d1d1d0a1c046e56cf6e8d7ec3 not found: ID does not exist" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.524751 4585 scope.go:117] 
"RemoveContainer" containerID="ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4" Feb 15 17:09:21 crc kubenswrapper[4585]: E0215 17:09:21.525031 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4\": container with ID starting with ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4 not found: ID does not exist" containerID="ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.525081 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4"} err="failed to get container status \"ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4\": rpc error: code = NotFound desc = could not find container \"ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4\": container with ID starting with ec73ccaed88c249bab4f9eab511f5ec47c6b07b29ec2e3cb5c01e0e7e5f4a6d4 not found: ID does not exist" Feb 15 17:09:21 crc kubenswrapper[4585]: I0215 17:09:21.528597 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jsq6s"] Feb 15 17:09:22 crc kubenswrapper[4585]: I0215 17:09:22.852579 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b27ca682-1703-426b-b644-3f28226eda98" path="/var/lib/kubelet/pods/b27ca682-1703-426b-b644-3f28226eda98/volumes" Feb 15 17:09:24 crc kubenswrapper[4585]: I0215 17:09:24.498152 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:09:24 crc kubenswrapper[4585]: I0215 17:09:24.559869 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6jz4"] Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 
17:09:25.428560 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z6jz4" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="registry-server" containerID="cri-o://690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b" gracePeriod=2 Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.778412 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.807851 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-catalog-content\") pod \"4e461961-6ad6-4275-b1e4-dc540e024b8a\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.807942 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nqdv\" (UniqueName: \"kubernetes.io/projected/4e461961-6ad6-4275-b1e4-dc540e024b8a-kube-api-access-4nqdv\") pod \"4e461961-6ad6-4275-b1e4-dc540e024b8a\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.808008 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-utilities\") pod \"4e461961-6ad6-4275-b1e4-dc540e024b8a\" (UID: \"4e461961-6ad6-4275-b1e4-dc540e024b8a\") " Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.808853 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-utilities" (OuterVolumeSpecName: "utilities") pod "4e461961-6ad6-4275-b1e4-dc540e024b8a" (UID: "4e461961-6ad6-4275-b1e4-dc540e024b8a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.819556 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e461961-6ad6-4275-b1e4-dc540e024b8a-kube-api-access-4nqdv" (OuterVolumeSpecName: "kube-api-access-4nqdv") pod "4e461961-6ad6-4275-b1e4-dc540e024b8a" (UID: "4e461961-6ad6-4275-b1e4-dc540e024b8a"). InnerVolumeSpecName "kube-api-access-4nqdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.860972 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e461961-6ad6-4275-b1e4-dc540e024b8a" (UID: "4e461961-6ad6-4275-b1e4-dc540e024b8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.909626 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nqdv\" (UniqueName: \"kubernetes.io/projected/4e461961-6ad6-4275-b1e4-dc540e024b8a-kube-api-access-4nqdv\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.909678 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:25 crc kubenswrapper[4585]: I0215 17:09:25.909690 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e461961-6ad6-4275-b1e4-dc540e024b8a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.435399 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e461961-6ad6-4275-b1e4-dc540e024b8a" 
containerID="690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b" exitCode=0 Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.435585 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6jz4" event={"ID":"4e461961-6ad6-4275-b1e4-dc540e024b8a","Type":"ContainerDied","Data":"690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b"} Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.435922 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6jz4" event={"ID":"4e461961-6ad6-4275-b1e4-dc540e024b8a","Type":"ContainerDied","Data":"8cba02e6c0a49ad15f139df883b59b6da57da02d59900e0891a9e7de2ae79599"} Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.435853 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6jz4" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.435963 4585 scope.go:117] "RemoveContainer" containerID="690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.464312 4585 scope.go:117] "RemoveContainer" containerID="81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.479169 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z6jz4"] Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.491857 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z6jz4"] Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.493759 4585 scope.go:117] "RemoveContainer" containerID="6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.506498 4585 scope.go:117] "RemoveContainer" containerID="690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b" Feb 15 
17:09:26 crc kubenswrapper[4585]: E0215 17:09:26.506957 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b\": container with ID starting with 690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b not found: ID does not exist" containerID="690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.507012 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b"} err="failed to get container status \"690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b\": rpc error: code = NotFound desc = could not find container \"690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b\": container with ID starting with 690f37690d85272dba461fa51b325638b1f6ab88c8c28186de756651189eb83b not found: ID does not exist" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.507045 4585 scope.go:117] "RemoveContainer" containerID="81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b" Feb 15 17:09:26 crc kubenswrapper[4585]: E0215 17:09:26.507323 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b\": container with ID starting with 81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b not found: ID does not exist" containerID="81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.507353 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b"} err="failed to get container status 
\"81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b\": rpc error: code = NotFound desc = could not find container \"81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b\": container with ID starting with 81f5d6c26638bc98f4278df57dd2a3ca81c1fca6f7727d8df4734ebb5405b05b not found: ID does not exist" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.507369 4585 scope.go:117] "RemoveContainer" containerID="6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d" Feb 15 17:09:26 crc kubenswrapper[4585]: E0215 17:09:26.507622 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d\": container with ID starting with 6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d not found: ID does not exist" containerID="6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.507640 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d"} err="failed to get container status \"6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d\": rpc error: code = NotFound desc = could not find container \"6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d\": container with ID starting with 6c591f4bcdef1f89cd6416c188c252508c8defaf9632c71f7eef990f9a11461d not found: ID does not exist" Feb 15 17:09:26 crc kubenswrapper[4585]: I0215 17:09:26.848257 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" path="/var/lib/kubelet/pods/4e461961-6ad6-4275-b1e4-dc540e024b8a/volumes" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.574101 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sbsth"] Feb 15 
17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.576057 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sbsth" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="registry-server" containerID="cri-o://ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00" gracePeriod=30 Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.581949 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4c48n"] Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.582166 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4c48n" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="registry-server" containerID="cri-o://8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f" gracePeriod=30 Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.588955 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z5d9n"] Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.589389 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" podUID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" containerName="marketplace-operator" containerID="cri-o://854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f" gracePeriod=30 Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.605992 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vk82"] Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.606360 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6vk82" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="registry-server" 
containerID="cri-o://b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a" gracePeriod=30 Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620148 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h2jdb"] Feb 15 17:09:42 crc kubenswrapper[4585]: E0215 17:09:42.620451 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="registry-server" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620473 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="registry-server" Feb 15 17:09:42 crc kubenswrapper[4585]: E0215 17:09:42.620497 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="extract-utilities" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620512 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="extract-utilities" Feb 15 17:09:42 crc kubenswrapper[4585]: E0215 17:09:42.620538 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="registry-server" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620549 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="registry-server" Feb 15 17:09:42 crc kubenswrapper[4585]: E0215 17:09:42.620567 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="extract-content" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620578 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="extract-content" Feb 15 17:09:42 crc kubenswrapper[4585]: E0215 17:09:42.620595 4585 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="extract-content" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620634 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="extract-content" Feb 15 17:09:42 crc kubenswrapper[4585]: E0215 17:09:42.620653 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="extract-utilities" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620665 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="extract-utilities" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620865 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e461961-6ad6-4275-b1e4-dc540e024b8a" containerName="registry-server" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.620891 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b27ca682-1703-426b-b644-3f28226eda98" containerName="registry-server" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.621310 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.624569 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8nshz"] Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.624800 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8nshz" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="registry-server" containerID="cri-o://eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb" gracePeriod=30 Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.628398 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h2jdb"] Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.735611 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmtr\" (UniqueName: \"kubernetes.io/projected/78351f70-0518-4b26-b551-48b047371fa7-kube-api-access-2cmtr\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.735666 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78351f70-0518-4b26-b551-48b047371fa7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.735727 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/78351f70-0518-4b26-b551-48b047371fa7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.836959 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78351f70-0518-4b26-b551-48b047371fa7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.837009 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmtr\" (UniqueName: \"kubernetes.io/projected/78351f70-0518-4b26-b551-48b047371fa7-kube-api-access-2cmtr\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.837032 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78351f70-0518-4b26-b551-48b047371fa7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.838253 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78351f70-0518-4b26-b551-48b047371fa7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc 
kubenswrapper[4585]: I0215 17:09:42.859463 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78351f70-0518-4b26-b551-48b047371fa7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.860381 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmtr\" (UniqueName: \"kubernetes.io/projected/78351f70-0518-4b26-b551-48b047371fa7-kube-api-access-2cmtr\") pod \"marketplace-operator-79b997595-h2jdb\" (UID: \"78351f70-0518-4b26-b551-48b047371fa7\") " pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:42 crc kubenswrapper[4585]: I0215 17:09:42.943750 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.072360 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.119379 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.128728 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.136825 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.168152 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.173994 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gwvw\" (UniqueName: \"kubernetes.io/projected/76a47487-6876-4c12-9b15-d8594ad9d748-kube-api-access-9gwvw\") pod \"76a47487-6876-4c12-9b15-d8594ad9d748\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174038 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-trusted-ca\") pod \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174069 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-catalog-content\") pod \"76a47487-6876-4c12-9b15-d8594ad9d748\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174089 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbvz4\" (UniqueName: \"kubernetes.io/projected/329f56ce-8e35-4eec-adaf-123808e4af4e-kube-api-access-vbvz4\") pod \"329f56ce-8e35-4eec-adaf-123808e4af4e\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174137 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-operator-metrics\") pod \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174160 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-utilities\") pod \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174184 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rt7b\" (UniqueName: \"kubernetes.io/projected/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-kube-api-access-5rt7b\") pod \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\" (UID: \"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174223 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-utilities\") pod \"329f56ce-8e35-4eec-adaf-123808e4af4e\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174255 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hqn8\" (UniqueName: \"kubernetes.io/projected/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-kube-api-access-7hqn8\") pod \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174280 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-catalog-content\") pod \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\" (UID: \"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174314 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-catalog-content\") pod 
\"329f56ce-8e35-4eec-adaf-123808e4af4e\" (UID: \"329f56ce-8e35-4eec-adaf-123808e4af4e\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.174351 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-utilities\") pod \"76a47487-6876-4c12-9b15-d8594ad9d748\" (UID: \"76a47487-6876-4c12-9b15-d8594ad9d748\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.179638 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-utilities" (OuterVolumeSpecName: "utilities") pod "76a47487-6876-4c12-9b15-d8594ad9d748" (UID: "76a47487-6876-4c12-9b15-d8594ad9d748"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.183008 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-utilities" (OuterVolumeSpecName: "utilities") pod "a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" (UID: "a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.187023 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" (UID: "bda1e602-f2c2-44f9-8b27-2b4e25eab6d1"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.197002 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-utilities" (OuterVolumeSpecName: "utilities") pod "329f56ce-8e35-4eec-adaf-123808e4af4e" (UID: "329f56ce-8e35-4eec-adaf-123808e4af4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.216823 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329f56ce-8e35-4eec-adaf-123808e4af4e-kube-api-access-vbvz4" (OuterVolumeSpecName: "kube-api-access-vbvz4") pod "329f56ce-8e35-4eec-adaf-123808e4af4e" (UID: "329f56ce-8e35-4eec-adaf-123808e4af4e"). InnerVolumeSpecName "kube-api-access-vbvz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.217147 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a47487-6876-4c12-9b15-d8594ad9d748-kube-api-access-9gwvw" (OuterVolumeSpecName: "kube-api-access-9gwvw") pod "76a47487-6876-4c12-9b15-d8594ad9d748" (UID: "76a47487-6876-4c12-9b15-d8594ad9d748"). InnerVolumeSpecName "kube-api-access-9gwvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.217204 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-kube-api-access-7hqn8" (OuterVolumeSpecName: "kube-api-access-7hqn8") pod "a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" (UID: "a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4"). InnerVolumeSpecName "kube-api-access-7hqn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.217583 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-kube-api-access-5rt7b" (OuterVolumeSpecName: "kube-api-access-5rt7b") pod "bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" (UID: "bda1e602-f2c2-44f9-8b27-2b4e25eab6d1"). InnerVolumeSpecName "kube-api-access-5rt7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.218663 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" (UID: "bda1e602-f2c2-44f9-8b27-2b4e25eab6d1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.270177 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" (UID: "a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.277911 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-catalog-content\") pod \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.277964 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-utilities\") pod \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278035 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lr2c\" (UniqueName: \"kubernetes.io/projected/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-kube-api-access-5lr2c\") pod \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\" (UID: \"d3beabcf-c3ca-49e7-a5e5-5719f184fab7\") " Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278476 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278494 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278523 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rt7b\" (UniqueName: \"kubernetes.io/projected/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-kube-api-access-5rt7b\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: 
I0215 17:09:43.278534 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278542 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hqn8\" (UniqueName: \"kubernetes.io/projected/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-kube-api-access-7hqn8\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278550 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278559 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278567 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gwvw\" (UniqueName: \"kubernetes.io/projected/76a47487-6876-4c12-9b15-d8594ad9d748-kube-api-access-9gwvw\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278576 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.278615 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbvz4\" (UniqueName: \"kubernetes.io/projected/329f56ce-8e35-4eec-adaf-123808e4af4e-kube-api-access-vbvz4\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.281017 4585 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-utilities" (OuterVolumeSpecName: "utilities") pod "d3beabcf-c3ca-49e7-a5e5-5719f184fab7" (UID: "d3beabcf-c3ca-49e7-a5e5-5719f184fab7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.282751 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-kube-api-access-5lr2c" (OuterVolumeSpecName: "kube-api-access-5lr2c") pod "d3beabcf-c3ca-49e7-a5e5-5719f184fab7" (UID: "d3beabcf-c3ca-49e7-a5e5-5719f184fab7"). InnerVolumeSpecName "kube-api-access-5lr2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.313969 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "329f56ce-8e35-4eec-adaf-123808e4af4e" (UID: "329f56ce-8e35-4eec-adaf-123808e4af4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.315999 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3beabcf-c3ca-49e7-a5e5-5719f184fab7" (UID: "d3beabcf-c3ca-49e7-a5e5-5719f184fab7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.379331 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.379354 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.379364 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/329f56ce-8e35-4eec-adaf-123808e4af4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.379372 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lr2c\" (UniqueName: \"kubernetes.io/projected/d3beabcf-c3ca-49e7-a5e5-5719f184fab7-kube-api-access-5lr2c\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.420327 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76a47487-6876-4c12-9b15-d8594ad9d748" (UID: "76a47487-6876-4c12-9b15-d8594ad9d748"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.471240 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h2jdb"] Feb 15 17:09:43 crc kubenswrapper[4585]: W0215 17:09:43.476759 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78351f70_0518_4b26_b551_48b047371fa7.slice/crio-92ff2644b54a3d0dc70f47ec119f016ead69c25f0c7bf10998071e5552aae08e WatchSource:0}: Error finding container 92ff2644b54a3d0dc70f47ec119f016ead69c25f0c7bf10998071e5552aae08e: Status 404 returned error can't find the container with id 92ff2644b54a3d0dc70f47ec119f016ead69c25f0c7bf10998071e5552aae08e Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.479927 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76a47487-6876-4c12-9b15-d8594ad9d748-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.556929 4585 generic.go:334] "Generic (PLEG): container finished" podID="76a47487-6876-4c12-9b15-d8594ad9d748" containerID="eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb" exitCode=0 Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.556981 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nshz" event={"ID":"76a47487-6876-4c12-9b15-d8594ad9d748","Type":"ContainerDied","Data":"eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.557035 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nshz" event={"ID":"76a47487-6876-4c12-9b15-d8594ad9d748","Type":"ContainerDied","Data":"094f855cd11a8fc900a090132f3168ee1356d52dd22272a3d0dcb87f557cc045"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 
17:09:43.557042 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8nshz" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.557054 4585 scope.go:117] "RemoveContainer" containerID="eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.558761 4585 generic.go:334] "Generic (PLEG): container finished" podID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerID="ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00" exitCode=0 Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.558803 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbsth" event={"ID":"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4","Type":"ContainerDied","Data":"ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.558835 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sbsth" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.558866 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sbsth" event={"ID":"a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4","Type":"ContainerDied","Data":"361c4f68b6ebbd666e53f0e075a875c539c873dd7d8540aa501673eebfb74c25"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.562480 4585 generic.go:334] "Generic (PLEG): container finished" podID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerID="8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f" exitCode=0 Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.562589 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c48n" event={"ID":"329f56ce-8e35-4eec-adaf-123808e4af4e","Type":"ContainerDied","Data":"8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.562622 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4c48n" event={"ID":"329f56ce-8e35-4eec-adaf-123808e4af4e","Type":"ContainerDied","Data":"e96e78abf43ca0dc906cdb3f3be4dc4a1aa3ea23fcc46744b884a2315d0a26df"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.562696 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4c48n" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.569791 4585 generic.go:334] "Generic (PLEG): container finished" podID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerID="b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a" exitCode=0 Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.569872 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vk82" event={"ID":"d3beabcf-c3ca-49e7-a5e5-5719f184fab7","Type":"ContainerDied","Data":"b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.569900 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6vk82" event={"ID":"d3beabcf-c3ca-49e7-a5e5-5719f184fab7","Type":"ContainerDied","Data":"613aa137e89e3a7d4b111cd788b87748782a516a498dae72ab1c4bf691058559"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.569992 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6vk82" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.573187 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" event={"ID":"78351f70-0518-4b26-b551-48b047371fa7","Type":"ContainerStarted","Data":"92ff2644b54a3d0dc70f47ec119f016ead69c25f0c7bf10998071e5552aae08e"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.576824 4585 generic.go:334] "Generic (PLEG): container finished" podID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" containerID="854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f" exitCode=0 Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.576858 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" event={"ID":"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1","Type":"ContainerDied","Data":"854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.576876 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" event={"ID":"bda1e602-f2c2-44f9-8b27-2b4e25eab6d1","Type":"ContainerDied","Data":"7270427d4f263e119dc67dc314cf7d59e02132f3e662cb2e577c45d6d77c57d7"} Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.576989 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z5d9n" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.585958 4585 scope.go:117] "RemoveContainer" containerID="f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.615757 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8nshz"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.642016 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8nshz"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.642109 4585 scope.go:117] "RemoveContainer" containerID="1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.649083 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sbsth"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.662983 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sbsth"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.667006 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4c48n"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.667761 4585 scope.go:117] "RemoveContainer" containerID="eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.668273 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb\": container with ID starting with eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb not found: ID does not exist" containerID="eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 
17:09:43.668302 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb"} err="failed to get container status \"eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb\": rpc error: code = NotFound desc = could not find container \"eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb\": container with ID starting with eca8522ea9bcefb32c4c5e9f356ddf59c22b6322967f5eb79dd08839a3f380eb not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.668325 4585 scope.go:117] "RemoveContainer" containerID="f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.669569 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e\": container with ID starting with f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e not found: ID does not exist" containerID="f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.669613 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e"} err="failed to get container status \"f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e\": rpc error: code = NotFound desc = could not find container \"f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e\": container with ID starting with f43639eff972a02223512d4ee587d210065f2d8c308a8583a7b9aab2a09cb42e not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.669630 4585 scope.go:117] "RemoveContainer" containerID="1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946" Feb 15 17:09:43 crc 
kubenswrapper[4585]: E0215 17:09:43.670469 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946\": container with ID starting with 1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946 not found: ID does not exist" containerID="1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.670492 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946"} err="failed to get container status \"1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946\": rpc error: code = NotFound desc = could not find container \"1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946\": container with ID starting with 1c574f4b14607d860dd8f3c74d6025e13a6d6e3ee6df80d22835ce67afe16946 not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.670507 4585 scope.go:117] "RemoveContainer" containerID="ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.677865 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4c48n"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.680265 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z5d9n"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.682647 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z5d9n"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.684657 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vk82"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.684884 4585 
scope.go:117] "RemoveContainer" containerID="96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.686841 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6vk82"] Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.708352 4585 scope.go:117] "RemoveContainer" containerID="a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.751674 4585 scope.go:117] "RemoveContainer" containerID="ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.752218 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00\": container with ID starting with ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00 not found: ID does not exist" containerID="ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.752260 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00"} err="failed to get container status \"ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00\": rpc error: code = NotFound desc = could not find container \"ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00\": container with ID starting with ab764599fc83cb720d794492457a77359670da4e3cd8df66b1331451c33ada00 not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.752292 4585 scope.go:117] "RemoveContainer" containerID="96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.752514 4585 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089\": container with ID starting with 96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089 not found: ID does not exist" containerID="96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.752537 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089"} err="failed to get container status \"96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089\": rpc error: code = NotFound desc = could not find container \"96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089\": container with ID starting with 96f5c1ab5829f40a97c5805fd32dede10b6121246a085a31242aad336c74b089 not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.752551 4585 scope.go:117] "RemoveContainer" containerID="a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.752754 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f\": container with ID starting with a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f not found: ID does not exist" containerID="a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.752778 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f"} err="failed to get container status \"a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f\": rpc error: code = NotFound desc = could not find container 
\"a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f\": container with ID starting with a7dcf8392475faf6662269409055ae9d0725afdf0409a381c775764eae90da4f not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.752792 4585 scope.go:117] "RemoveContainer" containerID="8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.776366 4585 scope.go:117] "RemoveContainer" containerID="cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.792405 4585 scope.go:117] "RemoveContainer" containerID="bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.807783 4585 scope.go:117] "RemoveContainer" containerID="8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.808293 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f\": container with ID starting with 8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f not found: ID does not exist" containerID="8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.808336 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f"} err="failed to get container status \"8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f\": rpc error: code = NotFound desc = could not find container \"8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f\": container with ID starting with 8095aa7614bb5328d13b435b07bb54b074d3a0f67f36b68527b5c7b9090b3c6f not found: ID does not exist" Feb 15 17:09:43 crc 
kubenswrapper[4585]: I0215 17:09:43.808370 4585 scope.go:117] "RemoveContainer" containerID="cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.808750 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822\": container with ID starting with cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822 not found: ID does not exist" containerID="cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.808781 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822"} err="failed to get container status \"cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822\": rpc error: code = NotFound desc = could not find container \"cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822\": container with ID starting with cbcf25031eac1d978733ccf02832e2c165e22980d0d8959285642e5d52f4c822 not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.808805 4585 scope.go:117] "RemoveContainer" containerID="bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.809148 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48\": container with ID starting with bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48 not found: ID does not exist" containerID="bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.809166 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48"} err="failed to get container status \"bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48\": rpc error: code = NotFound desc = could not find container \"bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48\": container with ID starting with bf9073697ddcf4eb214a074f1b701a007a8693932fcb3f743d790c976d002c48 not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.809180 4585 scope.go:117] "RemoveContainer" containerID="b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.832941 4585 scope.go:117] "RemoveContainer" containerID="7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.851863 4585 scope.go:117] "RemoveContainer" containerID="1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.865194 4585 scope.go:117] "RemoveContainer" containerID="b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.865555 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a\": container with ID starting with b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a not found: ID does not exist" containerID="b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.865648 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a"} err="failed to get container status \"b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a\": rpc error: code = 
NotFound desc = could not find container \"b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a\": container with ID starting with b1293a0a6acb81b22e59e715fcefad1720b6b1d7d135ea16dc653debe9c32d0a not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.865686 4585 scope.go:117] "RemoveContainer" containerID="7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.866146 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1\": container with ID starting with 7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1 not found: ID does not exist" containerID="7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.866167 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1"} err="failed to get container status \"7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1\": rpc error: code = NotFound desc = could not find container \"7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1\": container with ID starting with 7e6b043479ba806e1f9c032592a3b206bc17a300c04104c69fb02e40fcc415c1 not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.866180 4585 scope.go:117] "RemoveContainer" containerID="1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.866446 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d\": container with ID starting with 
1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d not found: ID does not exist" containerID="1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.866475 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d"} err="failed to get container status \"1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d\": rpc error: code = NotFound desc = could not find container \"1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d\": container with ID starting with 1256ce024451dd54bd4a979f558c0865405fa449f810493c668066eafb75ce3d not found: ID does not exist" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.866488 4585 scope.go:117] "RemoveContainer" containerID="854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.877665 4585 scope.go:117] "RemoveContainer" containerID="854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f" Feb 15 17:09:43 crc kubenswrapper[4585]: E0215 17:09:43.878015 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f\": container with ID starting with 854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f not found: ID does not exist" containerID="854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f" Feb 15 17:09:43 crc kubenswrapper[4585]: I0215 17:09:43.878061 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f"} err="failed to get container status \"854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f\": rpc error: code = NotFound desc = could not find container 
\"854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f\": container with ID starting with 854aa13d5b9fc8d0b1854f15a46a362dffa7421d42a9181ce8565ace35ccb01f not found: ID does not exist" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.582682 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" event={"ID":"78351f70-0518-4b26-b551-48b047371fa7","Type":"ContainerStarted","Data":"5dcbb4db93d54caddf14b82b15b57f2874d53e45ca25cc9c14440d667b497df2"} Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.583091 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.587975 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.625376 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h2jdb" podStartSLOduration=2.6253429280000002 podStartE2EDuration="2.625342928s" podCreationTimestamp="2026-02-15 17:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:09:44.6045855 +0000 UTC m=+240.547993632" watchObservedRunningTime="2026-02-15 17:09:44.625342928 +0000 UTC m=+240.568751070" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.790377 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-48nwt"] Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791268 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791291 4585 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791304 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" containerName="marketplace-operator" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791312 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" containerName="marketplace-operator" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791320 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791327 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791336 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791343 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791352 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791358 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791368 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791375 4585 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791387 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791395 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791405 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791414 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791423 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791429 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791438 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791446 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791456 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791463 4585 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791472 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791480 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="extract-utilities" Feb 15 17:09:44 crc kubenswrapper[4585]: E0215 17:09:44.791493 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791500 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="extract-content" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791639 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791652 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791665 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791676 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" containerName="registry-server" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.791690 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" containerName="marketplace-operator" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.792660 4585 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.795156 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.805043 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48nwt"] Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.857388 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329f56ce-8e35-4eec-adaf-123808e4af4e" path="/var/lib/kubelet/pods/329f56ce-8e35-4eec-adaf-123808e4af4e/volumes" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.858512 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a47487-6876-4c12-9b15-d8594ad9d748" path="/var/lib/kubelet/pods/76a47487-6876-4c12-9b15-d8594ad9d748/volumes" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.860324 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4" path="/var/lib/kubelet/pods/a4d1c19e-a41e-4dce-ba27-bfe2967ce0a4/volumes" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.861736 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda1e602-f2c2-44f9-8b27-2b4e25eab6d1" path="/var/lib/kubelet/pods/bda1e602-f2c2-44f9-8b27-2b4e25eab6d1/volumes" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.862354 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3beabcf-c3ca-49e7-a5e5-5719f184fab7" path="/var/lib/kubelet/pods/d3beabcf-c3ca-49e7-a5e5-5719f184fab7/volumes" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.892915 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7174606f-3071-43e5-88ce-c9e22d408b2e-utilities\") pod 
\"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.892991 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7174606f-3071-43e5-88ce-c9e22d408b2e-catalog-content\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.893021 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfcz\" (UniqueName: \"kubernetes.io/projected/7174606f-3071-43e5-88ce-c9e22d408b2e-kube-api-access-shfcz\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.993198 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m6sll"] Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.993585 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7174606f-3071-43e5-88ce-c9e22d408b2e-catalog-content\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.993644 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfcz\" (UniqueName: \"kubernetes.io/projected/7174606f-3071-43e5-88ce-c9e22d408b2e-kube-api-access-shfcz\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 
17:09:44.993689 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7174606f-3071-43e5-88ce-c9e22d408b2e-utilities\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.994332 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.994359 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7174606f-3071-43e5-88ce-c9e22d408b2e-utilities\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.994390 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7174606f-3071-43e5-88ce-c9e22d408b2e-catalog-content\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:44 crc kubenswrapper[4585]: I0215 17:09:44.997167 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.009566 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6sll"] Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.072028 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfcz\" (UniqueName: \"kubernetes.io/projected/7174606f-3071-43e5-88ce-c9e22d408b2e-kube-api-access-shfcz\") pod \"redhat-marketplace-48nwt\" (UID: \"7174606f-3071-43e5-88ce-c9e22d408b2e\") " 
pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.096387 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbc5\" (UniqueName: \"kubernetes.io/projected/f615e254-695a-452c-86c3-312b4dbabdb1-kube-api-access-wfbc5\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.096546 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f615e254-695a-452c-86c3-312b4dbabdb1-utilities\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.096663 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f615e254-695a-452c-86c3-312b4dbabdb1-catalog-content\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.112998 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.205450 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f615e254-695a-452c-86c3-312b4dbabdb1-catalog-content\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.206694 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfbc5\" (UniqueName: \"kubernetes.io/projected/f615e254-695a-452c-86c3-312b4dbabdb1-kube-api-access-wfbc5\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.206761 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f615e254-695a-452c-86c3-312b4dbabdb1-utilities\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.207013 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f615e254-695a-452c-86c3-312b4dbabdb1-utilities\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.206149 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f615e254-695a-452c-86c3-312b4dbabdb1-catalog-content\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " 
pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.224896 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbc5\" (UniqueName: \"kubernetes.io/projected/f615e254-695a-452c-86c3-312b4dbabdb1-kube-api-access-wfbc5\") pod \"redhat-operators-m6sll\" (UID: \"f615e254-695a-452c-86c3-312b4dbabdb1\") " pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.302774 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48nwt"] Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.366501 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.595775 4585 generic.go:334] "Generic (PLEG): container finished" podID="7174606f-3071-43e5-88ce-c9e22d408b2e" containerID="a32049497c741e73dbe351a2bdc85d35bf0e9adff7e27bbf4bfb69272c6e9920" exitCode=0 Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.596472 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nwt" event={"ID":"7174606f-3071-43e5-88ce-c9e22d408b2e","Type":"ContainerDied","Data":"a32049497c741e73dbe351a2bdc85d35bf0e9adff7e27bbf4bfb69272c6e9920"} Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.596496 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nwt" event={"ID":"7174606f-3071-43e5-88ce-c9e22d408b2e","Type":"ContainerStarted","Data":"6f659c35c1632bad4266f6a6346d962ef220404626d74a06ab013eb5dd444803"} Feb 15 17:09:45 crc kubenswrapper[4585]: I0215 17:09:45.740120 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6sll"] Feb 15 17:09:45 crc kubenswrapper[4585]: W0215 17:09:45.747321 4585 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf615e254_695a_452c_86c3_312b4dbabdb1.slice/crio-f5dbb63297ffe05057be9b038b6538d882b1f386eef9e4756c564cb9f015c4fa WatchSource:0}: Error finding container f5dbb63297ffe05057be9b038b6538d882b1f386eef9e4756c564cb9f015c4fa: Status 404 returned error can't find the container with id f5dbb63297ffe05057be9b038b6538d882b1f386eef9e4756c564cb9f015c4fa Feb 15 17:09:46 crc kubenswrapper[4585]: I0215 17:09:46.601457 4585 generic.go:334] "Generic (PLEG): container finished" podID="7174606f-3071-43e5-88ce-c9e22d408b2e" containerID="b9fef91d4210446daf1216de726736387c4460e18e3810089b54f1a0db6d12c2" exitCode=0 Feb 15 17:09:46 crc kubenswrapper[4585]: I0215 17:09:46.602254 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nwt" event={"ID":"7174606f-3071-43e5-88ce-c9e22d408b2e","Type":"ContainerDied","Data":"b9fef91d4210446daf1216de726736387c4460e18e3810089b54f1a0db6d12c2"} Feb 15 17:09:46 crc kubenswrapper[4585]: I0215 17:09:46.604869 4585 generic.go:334] "Generic (PLEG): container finished" podID="f615e254-695a-452c-86c3-312b4dbabdb1" containerID="a0ad06537b950848beaf74b1ac3308d32f302bf3b2b5478d6b185d6789373c49" exitCode=0 Feb 15 17:09:46 crc kubenswrapper[4585]: I0215 17:09:46.605786 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6sll" event={"ID":"f615e254-695a-452c-86c3-312b4dbabdb1","Type":"ContainerDied","Data":"a0ad06537b950848beaf74b1ac3308d32f302bf3b2b5478d6b185d6789373c49"} Feb 15 17:09:46 crc kubenswrapper[4585]: I0215 17:09:46.605816 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6sll" event={"ID":"f615e254-695a-452c-86c3-312b4dbabdb1","Type":"ContainerStarted","Data":"f5dbb63297ffe05057be9b038b6538d882b1f386eef9e4756c564cb9f015c4fa"} Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.191071 4585 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-rvmlr"] Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.192205 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.198008 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.228862 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc013a-2295-442e-b092-e7735bd01de9-catalog-content\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.228912 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqpd\" (UniqueName: \"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.228948 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc013a-2295-442e-b092-e7735bd01de9-utilities\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.251374 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvmlr"] Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.251694 4585 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.252940 4585 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.253272 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.253421 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711" gracePeriod=15 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.253433 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a" gracePeriod=15 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.253492 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c" gracePeriod=15 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.253503 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193" gracePeriod=15 Feb 15 17:09:47 crc 
kubenswrapper[4585]: I0215 17:09:47.253360 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f" gracePeriod=15 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299071 4585 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299751 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299766 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299775 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299781 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299788 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299795 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299804 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299810 4585 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299817 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299823 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299834 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299840 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299849 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299854 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.299860 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299868 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299949 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299959 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299968 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299976 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299984 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.299992 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.300000 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.307943 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.334966 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335009 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335038 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqpd\" (UniqueName: \"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335058 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335090 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335116 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc 
kubenswrapper[4585]: I0215 17:09:47.335131 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335208 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc013a-2295-442e-b092-e7735bd01de9-utilities\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335237 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335256 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335290 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc013a-2295-442e-b092-e7735bd01de9-catalog-content\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 
17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.335821 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65dc013a-2295-442e-b092-e7735bd01de9-catalog-content\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.336196 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65dc013a-2295-442e-b092-e7735bd01de9-utilities\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.340728 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cxqpd for pod openshift-marketplace/certified-operators-rvmlr: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.340825 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd podName:65dc013a-2295-442e-b092-e7735bd01de9 nodeName:}" failed. No retries permitted until 2026-02-15 17:09:47.840804211 +0000 UTC m=+243.784212333 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxqpd" (UniqueName: "kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd") pod "certified-operators-rvmlr" (UID: "65dc013a-2295-442e-b092-e7735bd01de9") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.427995 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193.scope\": RecentStats: unable to find data in memory cache]" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436645 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436726 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436762 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436792 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436816 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436841 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436863 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436904 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.436991 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.437721 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.437783 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.437802 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.437823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.437846 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.437868 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.437876 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.604392 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.617558 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.618562 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.619172 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711" exitCode=0 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.619197 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c" exitCode=0 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.619206 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a" exitCode=0 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.619216 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193" exitCode=2 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.619289 4585 scope.go:117] "RemoveContainer" containerID="0b3a6b8d1e780128d16c8aec2a6e2ea4f6224a50ba23159c666972c4da859ce4" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 
17:09:47.622456 4585 generic.go:334] "Generic (PLEG): container finished" podID="83fd11a6-03b9-427b-8678-ba01bad122cd" containerID="c9c6bbfe5bc5735e9f1d830c37c41316e595d052e7d6d095743a537dd82da747" exitCode=0 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.622564 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"83fd11a6-03b9-427b-8678-ba01bad122cd","Type":"ContainerDied","Data":"c9c6bbfe5bc5735e9f1d830c37c41316e595d052e7d6d095743a537dd82da747"} Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.624184 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.624452 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.624822 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.625943 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48nwt" 
event={"ID":"7174606f-3071-43e5-88ce-c9e22d408b2e","Type":"ContainerStarted","Data":"6ff3aa919d3b614b4e3447fc91e2238086060b81c5d74e533b88e2153ef765ef"} Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.626716 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.627040 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.627312 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.627496 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.628177 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6sll" 
event={"ID":"f615e254-695a-452c-86c3-312b4dbabdb1","Type":"ContainerStarted","Data":"572b4c74033de15edb0aff2ec1ebf621b0865da274467b81476b6d6f702885d0"} Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.629078 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.629239 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.629428 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.629622 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.629769 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:47 crc kubenswrapper[4585]: W0215 17:09:47.633048 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-560ec0cc49f87d754ce70855a04a671c34b5c14bbda59b68025d0a1aaa4ded89 WatchSource:0}: Error finding container 560ec0cc49f87d754ce70855a04a671c34b5c14bbda59b68025d0a1aaa4ded89: Status 404 returned error can't find the container with id 560ec0cc49f87d754ce70855a04a671c34b5c14bbda59b68025d0a1aaa4ded89 Feb 15 17:09:47 crc kubenswrapper[4585]: I0215 17:09:47.842530 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqpd\" (UniqueName: \"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.843439 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cxqpd for pod openshift-marketplace/certified-operators-rvmlr: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:47 crc kubenswrapper[4585]: E0215 17:09:47.843518 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd podName:65dc013a-2295-442e-b092-e7735bd01de9 nodeName:}" failed. No retries permitted until 2026-02-15 17:09:48.843496205 +0000 UTC m=+244.786904337 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxqpd" (UniqueName: "kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd") pod "certified-operators-rvmlr" (UID: "65dc013a-2295-442e-b092-e7735bd01de9") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.638743 4585 generic.go:334] "Generic (PLEG): container finished" podID="f615e254-695a-452c-86c3-312b4dbabdb1" containerID="572b4c74033de15edb0aff2ec1ebf621b0865da274467b81476b6d6f702885d0" exitCode=0 Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.638802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6sll" event={"ID":"f615e254-695a-452c-86c3-312b4dbabdb1","Type":"ContainerDied","Data":"572b4c74033de15edb0aff2ec1ebf621b0865da274467b81476b6d6f702885d0"} Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.639715 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.640853 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.641415 4585 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.641756 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.642306 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.643949 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146"} Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.643996 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"560ec0cc49f87d754ce70855a04a671c34b5c14bbda59b68025d0a1aaa4ded89"} Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.645287 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.645987 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.646762 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.647197 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.649765 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.653442 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 15 
17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.658095 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.658152 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.854265 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqpd\" (UniqueName: \"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:48 crc kubenswrapper[4585]: E0215 17:09:48.855342 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cxqpd for pod openshift-marketplace/certified-operators-rvmlr: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:48 crc kubenswrapper[4585]: E0215 17:09:48.855491 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd podName:65dc013a-2295-442e-b092-e7735bd01de9 nodeName:}" failed. No retries permitted until 2026-02-15 17:09:50.855453057 +0000 UTC m=+246.798861239 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxqpd" (UniqueName: "kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd") pod "certified-operators-rvmlr" (UID: "65dc013a-2295-442e-b092-e7735bd01de9") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.926061 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.927022 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.927485 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.927981 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:48 crc kubenswrapper[4585]: I0215 17:09:48.928404 4585 status_manager.go:851] "Failed to get status for pod" 
podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.057588 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-kubelet-dir\") pod \"83fd11a6-03b9-427b-8678-ba01bad122cd\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.057706 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fd11a6-03b9-427b-8678-ba01bad122cd-kube-api-access\") pod \"83fd11a6-03b9-427b-8678-ba01bad122cd\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.057763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-var-lock\") pod \"83fd11a6-03b9-427b-8678-ba01bad122cd\" (UID: \"83fd11a6-03b9-427b-8678-ba01bad122cd\") " Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.057872 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "83fd11a6-03b9-427b-8678-ba01bad122cd" (UID: "83fd11a6-03b9-427b-8678-ba01bad122cd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.058075 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-var-lock" (OuterVolumeSpecName: "var-lock") pod "83fd11a6-03b9-427b-8678-ba01bad122cd" (UID: "83fd11a6-03b9-427b-8678-ba01bad122cd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.058140 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.066368 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83fd11a6-03b9-427b-8678-ba01bad122cd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "83fd11a6-03b9-427b-8678-ba01bad122cd" (UID: "83fd11a6-03b9-427b-8678-ba01bad122cd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.160039 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83fd11a6-03b9-427b-8678-ba01bad122cd-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.160150 4585 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/83fd11a6-03b9-427b-8678-ba01bad122cd-var-lock\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.660455 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.662011 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6sll" event={"ID":"f615e254-695a-452c-86c3-312b4dbabdb1","Type":"ContainerStarted","Data":"a794531f8d5aee43aa40a33bcf5f0208b78f4735d20412ecaa5b5b87a06089e2"} Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.662292 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.662381 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.663047 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.663765 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.663988 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.664300 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": 
dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.664529 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.664853 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.665019 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.665928 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.667826 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.667902 4585 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f" exitCode=0 Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.667942 4585 scope.go:117] "RemoveContainer" containerID="ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.670450 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.670440 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"83fd11a6-03b9-427b-8678-ba01bad122cd","Type":"ContainerDied","Data":"1d76e22afc68ff9d5824c8579369ffe23abe61d159c1734dea3705b1c83acedd"} Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.670591 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d76e22afc68ff9d5824c8579369ffe23abe61d159c1734dea3705b1c83acedd" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.681939 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.682167 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.682361 4585 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.682579 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.683029 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.685495 4585 scope.go:117] "RemoveContainer" containerID="9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.699469 4585 scope.go:117] "RemoveContainer" containerID="997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.717810 4585 scope.go:117] "RemoveContainer" containerID="a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.731463 4585 scope.go:117] "RemoveContainer" containerID="b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.745015 4585 scope.go:117] "RemoveContainer" containerID="5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.764996 
4585 scope.go:117] "RemoveContainer" containerID="ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711" Feb 15 17:09:49 crc kubenswrapper[4585]: E0215 17:09:49.765545 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\": container with ID starting with ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711 not found: ID does not exist" containerID="ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.765580 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711"} err="failed to get container status \"ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\": rpc error: code = NotFound desc = could not find container \"ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711\": container with ID starting with ee819d29d9d93d4499d4024c0755c5e0882ec45337b8e1f24e84463dd200e711 not found: ID does not exist" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.765615 4585 scope.go:117] "RemoveContainer" containerID="9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c" Feb 15 17:09:49 crc kubenswrapper[4585]: E0215 17:09:49.766736 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\": container with ID starting with 9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c not found: ID does not exist" containerID="9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.766765 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c"} err="failed to get container status \"9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\": rpc error: code = NotFound desc = could not find container \"9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c\": container with ID starting with 9d24ea0931bef1bf571295eeb92f41e0016406a040c3d9331831bca1a4d6934c not found: ID does not exist" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.766785 4585 scope.go:117] "RemoveContainer" containerID="997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.766988 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767026 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767052 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767074 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767105 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767209 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:09:49 crc kubenswrapper[4585]: E0215 17:09:49.767256 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\": container with ID starting with 997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a not found: ID does not exist" containerID="997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767272 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a"} err="failed to get container status \"997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\": rpc error: code = NotFound desc = could not find container \"997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a\": container with ID starting with 997198c77dec2238970c8afd2a6ea3c27207644abf85dbb5b135a8be5c00c86a not found: ID does not exist" Feb 15 17:09:49 crc 
kubenswrapper[4585]: I0215 17:09:49.767283 4585 scope.go:117] "RemoveContainer" containerID="a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767361 4585 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767372 4585 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767382 4585 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 15 17:09:49 crc kubenswrapper[4585]: E0215 17:09:49.767540 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\": container with ID starting with a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193 not found: ID does not exist" containerID="a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767561 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193"} err="failed to get container status \"a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\": rpc error: code = NotFound desc = could not find container \"a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193\": container with ID starting with a58011618f3d5e85defb1c0f123e66f669002c87831cc99048280fc5fc5d2193 not found: ID does not exist" Feb 15 17:09:49 
crc kubenswrapper[4585]: I0215 17:09:49.767572 4585 scope.go:117] "RemoveContainer" containerID="b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f" Feb 15 17:09:49 crc kubenswrapper[4585]: E0215 17:09:49.767857 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\": container with ID starting with b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f not found: ID does not exist" containerID="b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767885 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f"} err="failed to get container status \"b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\": rpc error: code = NotFound desc = could not find container \"b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f\": container with ID starting with b6e07a3e82be60b20455c612d64aef3d31d4d41d5152c8b6895abdc2d4525b5f not found: ID does not exist" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.767902 4585 scope.go:117] "RemoveContainer" containerID="5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789" Feb 15 17:09:49 crc kubenswrapper[4585]: E0215 17:09:49.768690 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\": container with ID starting with 5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789 not found: ID does not exist" containerID="5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789" Feb 15 17:09:49 crc kubenswrapper[4585]: I0215 17:09:49.768732 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789"} err="failed to get container status \"5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\": rpc error: code = NotFound desc = could not find container \"5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789\": container with ID starting with 5ff1225b6d23afaab961c75904d8b41facbbfd33ead1fb112f31fdf88d721789 not found: ID does not exist" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.676220 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.697013 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.697345 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.697656 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.697897 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.698111 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.849439 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 15 17:09:50 crc kubenswrapper[4585]: I0215 17:09:50.881679 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqpd\" (UniqueName: \"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:50 crc kubenswrapper[4585]: E0215 17:09:50.882334 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cxqpd for pod openshift-marketplace/certified-operators-rvmlr: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:50 crc kubenswrapper[4585]: E0215 17:09:50.882409 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd podName:65dc013a-2295-442e-b092-e7735bd01de9 nodeName:}" failed. 
No retries permitted until 2026-02-15 17:09:54.882385881 +0000 UTC m=+250.825794073 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxqpd" (UniqueName: "kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd") pod "certified-operators-rvmlr" (UID: "65dc013a-2295-442e-b092-e7735bd01de9") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:52 crc kubenswrapper[4585]: E0215 17:09:52.323874 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.190:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-m6sll.18947aa1742dc04d openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-m6sll,UID:f615e254-695a-452c-86c3-312b4dbabdb1,APIVersion:v1,ResourceVersion:29631,FieldPath:spec.initContainers{extract-content},},Reason:Created,Message:Created container extract-content,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-15 17:09:47.320926285 +0000 UTC m=+243.264334417,LastTimestamp:2026-02-15 17:09:47.320926285 +0000 UTC m=+243.264334417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 15 17:09:53 crc kubenswrapper[4585]: E0215 17:09:53.552096 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.190:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-m6sll.18947aa1742dc04d openshift-marketplace 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-m6sll,UID:f615e254-695a-452c-86c3-312b4dbabdb1,APIVersion:v1,ResourceVersion:29631,FieldPath:spec.initContainers{extract-content},},Reason:Created,Message:Created container extract-content,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-15 17:09:47.320926285 +0000 UTC m=+243.264334417,LastTimestamp:2026-02-15 17:09:47.320926285 +0000 UTC m=+243.264334417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 15 17:09:54 crc kubenswrapper[4585]: I0215 17:09:54.848758 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:54 crc kubenswrapper[4585]: I0215 17:09:54.849427 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:54 crc kubenswrapper[4585]: I0215 17:09:54.849951 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:54 crc kubenswrapper[4585]: I0215 17:09:54.850126 4585 status_manager.go:851] "Failed to get status for pod" 
podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:54 crc kubenswrapper[4585]: I0215 17:09:54.934633 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqpd\" (UniqueName: \"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:09:54 crc kubenswrapper[4585]: E0215 17:09:54.935824 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cxqpd for pod openshift-marketplace/certified-operators-rvmlr: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:54 crc kubenswrapper[4585]: E0215 17:09:54.935954 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd podName:65dc013a-2295-442e-b092-e7735bd01de9 nodeName:}" failed. No retries permitted until 2026-02-15 17:10:02.93591849 +0000 UTC m=+258.879326662 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cxqpd" (UniqueName: "kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd") pod "certified-operators-rvmlr" (UID: "65dc013a-2295-442e-b092-e7735bd01de9") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/certified-operators/token": dial tcp 38.102.83.190:6443: connect: connection refused Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.113767 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.113823 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.158396 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.158994 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.159450 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.160152 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.160494 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.366876 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.366936 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:09:55 crc kubenswrapper[4585]: E0215 17:09:55.522183 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: E0215 17:09:55.522625 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: E0215 17:09:55.522900 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: E0215 17:09:55.523260 4585 controller.go:195] "Failed to 
update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: E0215 17:09:55.523764 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.523798 4585 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 15 17:09:55 crc kubenswrapper[4585]: E0215 17:09:55.524037 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="200ms" Feb 15 17:09:55 crc kubenswrapper[4585]: E0215 17:09:55.725420 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="400ms" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.768221 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-48nwt" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.769002 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.769389 
4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.769721 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:55 crc kubenswrapper[4585]: I0215 17:09:55.770078 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:56 crc kubenswrapper[4585]: E0215 17:09:56.126647 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="800ms" Feb 15 17:09:56 crc kubenswrapper[4585]: I0215 17:09:56.426478 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m6sll" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" containerName="registry-server" probeResult="failure" output=< Feb 15 17:09:56 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:09:56 crc kubenswrapper[4585]: > Feb 15 17:09:56 crc kubenswrapper[4585]: E0215 17:09:56.928439 4585 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="1.6s" Feb 15 17:09:58 crc kubenswrapper[4585]: E0215 17:09:58.529894 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.190:6443: connect: connection refused" interval="3.2s" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.766517 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.766667 4585 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4" exitCode=1 Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.766933 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4"} Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.767689 4585 scope.go:117] "RemoveContainer" containerID="ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.768110 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 
crc kubenswrapper[4585]: I0215 17:09:59.768719 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.769246 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.769700 4585 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.770122 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.841460 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.842964 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.843169 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.843372 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.843624 4585 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.843831 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.862062 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.862096 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:09:59 crc kubenswrapper[4585]: E0215 17:09:59.862665 4585 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:59 crc kubenswrapper[4585]: I0215 17:09:59.863172 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:09:59 crc kubenswrapper[4585]: W0215 17:09:59.904350 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-67fbfd1e6f4c46302d0e7d1c152877d4b75363ed73c8153f29f4f8729ac7f559 WatchSource:0}: Error finding container 67fbfd1e6f4c46302d0e7d1c152877d4b75363ed73c8153f29f4f8729ac7f559: Status 404 returned error can't find the container with id 67fbfd1e6f4c46302d0e7d1c152877d4b75363ed73c8153f29f4f8729ac7f559 Feb 15 17:09:59 crc kubenswrapper[4585]: E0215 17:09:59.923526 4585 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.190:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" volumeName="registry-storage" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.778421 4585 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1ab25c19f6c5679bd8abf0f324fcc6d99129b4fc35d22a978776d9c6c1c03bd7" exitCode=0 Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.778543 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1ab25c19f6c5679bd8abf0f324fcc6d99129b4fc35d22a978776d9c6c1c03bd7"} Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.778774 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"67fbfd1e6f4c46302d0e7d1c152877d4b75363ed73c8153f29f4f8729ac7f559"} Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.779044 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.779061 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:00 crc kubenswrapper[4585]: E0215 17:10:00.779984 4585 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.779993 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.780448 4585 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.780753 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.781094 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.781446 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.785732 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.785905 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cf58475869fcc5c135583d246a4aa500d5df951c911c78a96443a18fc62912e2"} Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.787554 4585 status_manager.go:851] "Failed to get status for pod" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 
17:10:00.788055 4585 status_manager.go:851] "Failed to get status for pod" podUID="f615e254-695a-452c-86c3-312b4dbabdb1" pod="openshift-marketplace/redhat-operators-m6sll" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-m6sll\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.788549 4585 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.788993 4585 status_manager.go:851] "Failed to get status for pod" podUID="7174606f-3071-43e5-88ce-c9e22d408b2e" pod="openshift-marketplace/redhat-marketplace-48nwt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-48nwt\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:00 crc kubenswrapper[4585]: I0215 17:10:00.789484 4585 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.190:6443: connect: connection refused" Feb 15 17:10:01 crc kubenswrapper[4585]: I0215 17:10:01.810137 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c035c2ea0e698084e0bc3a4fdc36523fbe00c5e562eede10425e73da1a433a4"} Feb 15 17:10:01 crc kubenswrapper[4585]: I0215 17:10:01.810384 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"69e5fb4f6949767d1945b4110960e6d3339b3d00242ad5b2a3843147f45dd155"} Feb 15 17:10:01 crc kubenswrapper[4585]: I0215 17:10:01.810398 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"288d1e5ea0803512a0632181bc03b8d97825c348d67f3cefa2b074f4f7e270c5"} Feb 15 17:10:01 crc kubenswrapper[4585]: I0215 17:10:01.810408 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2362316ee2bda4309eb8fc8b4d6f3fe24eb50b4b4280a702d48a368423168b57"} Feb 15 17:10:02 crc kubenswrapper[4585]: I0215 17:10:02.272949 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:10:02 crc kubenswrapper[4585]: I0215 17:10:02.818410 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8f9a3ff82358dd95177c26dc979cec5217ae003906d4aa6d8bea7ea2e55117ab"} Feb 15 17:10:02 crc kubenswrapper[4585]: I0215 17:10:02.818920 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:02 crc kubenswrapper[4585]: I0215 17:10:02.818962 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:02 crc kubenswrapper[4585]: I0215 17:10:02.963418 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqpd\" (UniqueName: 
\"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:10:04 crc kubenswrapper[4585]: I0215 17:10:04.864873 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:04 crc kubenswrapper[4585]: I0215 17:10:04.865330 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:04 crc kubenswrapper[4585]: I0215 17:10:04.873250 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:05 crc kubenswrapper[4585]: I0215 17:10:05.415687 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:10:05 crc kubenswrapper[4585]: I0215 17:10:05.468039 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m6sll" Feb 15 17:10:05 crc kubenswrapper[4585]: I0215 17:10:05.593672 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:10:05 crc kubenswrapper[4585]: I0215 17:10:05.594222 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 15 17:10:05 crc kubenswrapper[4585]: I0215 17:10:05.594318 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 15 17:10:07 crc kubenswrapper[4585]: I0215 17:10:07.832343 4585 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:07 crc kubenswrapper[4585]: I0215 17:10:07.851921 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:07 crc kubenswrapper[4585]: I0215 17:10:07.851979 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:07 crc kubenswrapper[4585]: I0215 17:10:07.852001 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:07 crc kubenswrapper[4585]: I0215 17:10:07.855308 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:07 crc kubenswrapper[4585]: I0215 17:10:07.948375 4585 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eceb43e9-fd25-4d79-9ab6-b35c2fc99ff8" Feb 15 17:10:07 crc kubenswrapper[4585]: I0215 17:10:07.986539 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqpd\" (UniqueName: \"kubernetes.io/projected/65dc013a-2295-442e-b092-e7735bd01de9-kube-api-access-cxqpd\") pod \"certified-operators-rvmlr\" (UID: \"65dc013a-2295-442e-b092-e7735bd01de9\") " pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:10:08 crc kubenswrapper[4585]: I0215 17:10:08.238632 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:10:08 crc kubenswrapper[4585]: I0215 17:10:08.857190 4585 generic.go:334] "Generic (PLEG): container finished" podID="65dc013a-2295-442e-b092-e7735bd01de9" containerID="99dae15a32ddd0ec4306f8508875a5a3b4decc2ed880ad73c36cb4a413e1dfa6" exitCode=0 Feb 15 17:10:08 crc kubenswrapper[4585]: I0215 17:10:08.857296 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmlr" event={"ID":"65dc013a-2295-442e-b092-e7735bd01de9","Type":"ContainerDied","Data":"99dae15a32ddd0ec4306f8508875a5a3b4decc2ed880ad73c36cb4a413e1dfa6"} Feb 15 17:10:08 crc kubenswrapper[4585]: I0215 17:10:08.858665 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmlr" event={"ID":"65dc013a-2295-442e-b092-e7735bd01de9","Type":"ContainerStarted","Data":"acb4416738cc25f79409ea41234c3e2c9d93fb6f6f7efc473857828d3f1a47fc"} Feb 15 17:10:08 crc kubenswrapper[4585]: I0215 17:10:08.858933 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:08 crc kubenswrapper[4585]: I0215 17:10:08.859013 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="51e5550d-fe21-4438-9d5a-7d4169075c94" Feb 15 17:10:08 crc kubenswrapper[4585]: I0215 17:10:08.876741 4585 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eceb43e9-fd25-4d79-9ab6-b35c2fc99ff8" Feb 15 17:10:09 crc kubenswrapper[4585]: I0215 17:10:09.869760 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmlr" 
event={"ID":"65dc013a-2295-442e-b092-e7735bd01de9","Type":"ContainerStarted","Data":"2299ac343e21edd6c86918b157a67adac445556882b1e54f0d0fe1abddba5504"} Feb 15 17:10:10 crc kubenswrapper[4585]: I0215 17:10:10.876662 4585 generic.go:334] "Generic (PLEG): container finished" podID="65dc013a-2295-442e-b092-e7735bd01de9" containerID="2299ac343e21edd6c86918b157a67adac445556882b1e54f0d0fe1abddba5504" exitCode=0 Feb 15 17:10:10 crc kubenswrapper[4585]: I0215 17:10:10.876753 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmlr" event={"ID":"65dc013a-2295-442e-b092-e7735bd01de9","Type":"ContainerDied","Data":"2299ac343e21edd6c86918b157a67adac445556882b1e54f0d0fe1abddba5504"} Feb 15 17:10:11 crc kubenswrapper[4585]: I0215 17:10:11.883839 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvmlr" event={"ID":"65dc013a-2295-442e-b092-e7735bd01de9","Type":"ContainerStarted","Data":"093636b5edff5638558a03783296808719f423eb95ed9ef1bb34aeacbef80642"} Feb 15 17:10:15 crc kubenswrapper[4585]: I0215 17:10:15.593974 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 15 17:10:15 crc kubenswrapper[4585]: I0215 17:10:15.594588 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 15 17:10:18 crc kubenswrapper[4585]: I0215 17:10:18.074354 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 15 17:10:18 crc kubenswrapper[4585]: I0215 17:10:18.239717 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:10:18 crc kubenswrapper[4585]: I0215 17:10:18.240088 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:10:18 crc kubenswrapper[4585]: I0215 17:10:18.315715 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:10:18 crc kubenswrapper[4585]: I0215 17:10:18.599143 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 15 17:10:18 crc kubenswrapper[4585]: I0215 17:10:18.642797 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 15 17:10:18 crc kubenswrapper[4585]: I0215 17:10:18.988655 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvmlr" Feb 15 17:10:19 crc kubenswrapper[4585]: I0215 17:10:19.159736 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 15 17:10:19 crc kubenswrapper[4585]: I0215 17:10:19.239478 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 15 17:10:19 crc kubenswrapper[4585]: I0215 17:10:19.319685 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 15 17:10:19 crc kubenswrapper[4585]: I0215 17:10:19.330645 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 15 17:10:19 crc kubenswrapper[4585]: I0215 
17:10:19.499124 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 15 17:10:19 crc kubenswrapper[4585]: I0215 17:10:19.982141 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.056827 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.272655 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.418262 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.442342 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.510932 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.562501 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.650856 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.848546 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.886582 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 
15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.925302 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 15 17:10:20 crc kubenswrapper[4585]: I0215 17:10:20.926044 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.022852 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.109886 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.141587 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.178414 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.357415 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.383128 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.496151 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.562278 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.571279 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" 
Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.643692 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.780917 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.863478 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.899831 4585 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 15 17:10:21 crc kubenswrapper[4585]: I0215 17:10:21.926896 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.147137 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.154029 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.184935 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.223474 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.285154 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.293076 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.337576 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.357153 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.374564 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.381035 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.470866 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.479042 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.707947 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.742091 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.745220 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.756670 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 
15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.764653 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.897378 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 15 17:10:22 crc kubenswrapper[4585]: I0215 17:10:22.928149 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.066122 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.080798 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.090176 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.185291 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.203825 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.242831 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.308182 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.332718 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.349664 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.389905 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.563777 4585 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.600863 4585 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.623263 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.638515 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.754649 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.903855 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.924873 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 15 17:10:23 crc kubenswrapper[4585]: I0215 17:10:23.983589 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.014516 4585 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.032583 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.133340 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.231334 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.231971 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.270782 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.279017 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.307790 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.324219 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.369559 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.375272 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 15 17:10:24 crc 
kubenswrapper[4585]: I0215 17:10:24.507431 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.528873 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.602295 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.614252 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.649977 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.711673 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.969303 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 15 17:10:24 crc kubenswrapper[4585]: I0215 17:10:24.996771 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.012630 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.044874 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.076728 4585 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-service-ca-operator"/"serving-cert" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.093947 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.113045 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.201189 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.356858 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.372078 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.413939 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.457121 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.458704 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.477983 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.556763 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 15 17:10:25 crc kubenswrapper[4585]: 
I0215 17:10:25.586966 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.593669 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.593753 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.593823 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.594999 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"cf58475869fcc5c135583d246a4aa500d5df951c911c78a96443a18fc62912e2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.595330 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://cf58475869fcc5c135583d246a4aa500d5df951c911c78a96443a18fc62912e2" gracePeriod=30 Feb 15 17:10:25 crc 
kubenswrapper[4585]: I0215 17:10:25.788125 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.809754 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.972841 4585 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.977219 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 15 17:10:25 crc kubenswrapper[4585]: I0215 17:10:25.985702 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.132144 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.162333 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.183137 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.256108 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.298859 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.301816 4585 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.372570 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.386428 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.386591 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.391227 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.413964 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.483809 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.491372 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.510664 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.545278 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.640112 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.676741 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.691301 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.789460 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.824547 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 15 17:10:26 crc kubenswrapper[4585]: I0215 17:10:26.986463 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.035256 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.133812 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.142056 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.152715 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.220127 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 
17:10:27.269844 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.321837 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.351163 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.388052 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.395895 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.442924 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.500211 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.509592 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.531816 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.694846 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.707155 4585 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.755248 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.870334 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.895540 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.928184 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 15 17:10:27 crc kubenswrapper[4585]: I0215 17:10:27.954825 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.037823 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.075077 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.160403 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.321434 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.333627 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.372507 4585 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.381002 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.447212 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.633706 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.634110 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.680203 4585 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.681721 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m6sll" podStartSLOduration=42.178068815 podStartE2EDuration="44.681705896s" podCreationTimestamp="2026-02-15 17:09:44 +0000 UTC" firstStartedPulling="2026-02-15 17:09:46.606251125 +0000 UTC m=+242.549659257" lastFinishedPulling="2026-02-15 17:09:49.109888206 +0000 UTC m=+245.053296338" observedRunningTime="2026-02-15 17:10:07.904718128 +0000 UTC m=+263.848126270" watchObservedRunningTime="2026-02-15 17:10:28.681705896 +0000 UTC m=+284.625114038" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.693974 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=41.693948963 podStartE2EDuration="41.693948963s" podCreationTimestamp="2026-02-15 17:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:10:07.860938735 +0000 UTC m=+263.804346867" watchObservedRunningTime="2026-02-15 17:10:28.693948963 +0000 UTC m=+284.637357115" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.694325 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvmlr" podStartSLOduration=39.165450312 podStartE2EDuration="41.694316893s" podCreationTimestamp="2026-02-15 17:09:47 +0000 UTC" firstStartedPulling="2026-02-15 17:10:08.858658322 +0000 UTC m=+264.802066464" lastFinishedPulling="2026-02-15 17:10:11.387524883 +0000 UTC m=+267.330933045" observedRunningTime="2026-02-15 17:10:11.899971457 +0000 UTC m=+267.843379599" watchObservedRunningTime="2026-02-15 17:10:28.694316893 +0000 UTC m=+284.637725035" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.694811 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-48nwt" podStartSLOduration=43.295704296 podStartE2EDuration="44.694805767s" podCreationTimestamp="2026-02-15 17:09:44 +0000 UTC" firstStartedPulling="2026-02-15 17:09:45.598235037 +0000 UTC m=+241.541643169" lastFinishedPulling="2026-02-15 17:09:46.997336498 +0000 UTC m=+242.940744640" observedRunningTime="2026-02-15 17:10:07.847916536 +0000 UTC m=+263.791324668" watchObservedRunningTime="2026-02-15 17:10:28.694805767 +0000 UTC m=+284.638213909" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.696092 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.696136 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.696158 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvmlr"] Feb 15 
17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.707358 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.710730 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.724237 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.724221652 podStartE2EDuration="21.724221652s" podCreationTimestamp="2026-02-15 17:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:10:28.720651751 +0000 UTC m=+284.664059883" watchObservedRunningTime="2026-02-15 17:10:28.724221652 +0000 UTC m=+284.667629784" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.778897 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.904659 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.955451 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 15 17:10:28 crc kubenswrapper[4585]: I0215 17:10:28.981676 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.048648 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.088071 4585 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.107405 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.190526 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.235512 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.279418 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.315877 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.327307 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.344298 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.440356 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.518484 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.636057 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.695379 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.724982 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.740803 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.776139 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 15 17:10:29 crc kubenswrapper[4585]: I0215 17:10:29.992062 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.007039 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.029579 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.092826 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.190035 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.195637 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.241643 4585 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.241867 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146" gracePeriod=5
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.298145 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.308726 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.340480 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.413759 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.438741 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.578836 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.623637 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.697002 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.759224 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.781830 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.810541 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 15 17:10:30 crc kubenswrapper[4585]: I0215 17:10:30.812106 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.038434 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.055800 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.063048 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.084434 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.138644 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.140857 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.227810 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.243790 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.282246 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.454630 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.477686 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.560071 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.577369 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.825015 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.835010 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.935561 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 15 17:10:31 crc kubenswrapper[4585]: I0215 17:10:31.996887 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.008156 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.027456 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.147090 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.315659 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.351132 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.382352 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.489663 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.563017 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.575079 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.588514 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.702503 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.748328 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 15 17:10:32 crc kubenswrapper[4585]: I0215 17:10:32.785039 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 15 17:10:33 crc kubenswrapper[4585]: I0215 17:10:33.116981 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 15 17:10:33 crc kubenswrapper[4585]: I0215 17:10:33.203384 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 15 17:10:33 crc kubenswrapper[4585]: I0215 17:10:33.313147 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 15 17:10:33 crc kubenswrapper[4585]: I0215 17:10:33.341959 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 15 17:10:33 crc kubenswrapper[4585]: I0215 17:10:33.362636 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 15 17:10:33 crc kubenswrapper[4585]: I0215 17:10:33.636946 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 15 17:10:33 crc kubenswrapper[4585]: I0215 17:10:33.872496 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 15 17:10:34 crc kubenswrapper[4585]: I0215 17:10:34.002098 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 15 17:10:34 crc kubenswrapper[4585]: I0215 17:10:34.121401 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 15 17:10:34 crc kubenswrapper[4585]: I0215 17:10:34.126023 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 15 17:10:34 crc kubenswrapper[4585]: I0215 17:10:34.132325 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 15 17:10:34 crc kubenswrapper[4585]: I0215 17:10:34.437362 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 15 17:10:34 crc kubenswrapper[4585]: I0215 17:10:34.550319 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 15 17:10:34 crc kubenswrapper[4585]: I0215 17:10:34.751314 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.036670 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.046632 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.391098 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.393231 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.429678 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578235 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578313 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578358 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578417 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578432 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578476 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578491 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578583 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578784 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578873 4585 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578898 4585 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578915 4585 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.578932 4585 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.590129 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 15 17:10:35 crc kubenswrapper[4585]: I0215 17:10:35.680272 4585 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.001231 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.031842 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.031929 4585 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146" exitCode=137
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.032014 4585 scope.go:117] "RemoveContainer" containerID="b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.032053 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.055007 4585 scope.go:117] "RemoveContainer" containerID="b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146"
Feb 15 17:10:36 crc kubenswrapper[4585]: E0215 17:10:36.055703 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146\": container with ID starting with b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146 not found: ID does not exist" containerID="b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.055911 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146"} err="failed to get container status \"b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146\": rpc error: code = NotFound desc = could not find container \"b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146\": container with ID starting with b80645439e785dee32ae8eca5c610e634224c23ce2c72201aedbaa165171b146 not found: ID does not exist"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.828878 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.853884 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.854413 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.868188 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.868249 4585 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c1b289c-c5f6-4729-8ac1-b53abade38ed"
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.874899 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 15 17:10:36 crc kubenswrapper[4585]: I0215 17:10:36.874955 4585 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8c1b289c-c5f6-4729-8ac1-b53abade38ed"
Feb 15 17:10:37 crc kubenswrapper[4585]: I0215 17:10:37.731360 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 15 17:10:44 crc kubenswrapper[4585]: I0215 17:10:44.579489 4585 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 15 17:10:55 crc kubenswrapper[4585]: I0215 17:10:55.996002 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6n9mz"]
Feb 15 17:10:55 crc kubenswrapper[4585]: E0215 17:10:55.997041 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 15 17:10:55 crc kubenswrapper[4585]: I0215 17:10:55.997065 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 15 17:10:55 crc kubenswrapper[4585]: E0215 17:10:55.997087 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" containerName="installer"
Feb 15 17:10:55 crc kubenswrapper[4585]: I0215 17:10:55.997100 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" containerName="installer"
Feb 15 17:10:55 crc kubenswrapper[4585]: I0215 17:10:55.997282 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="83fd11a6-03b9-427b-8678-ba01bad122cd" containerName="installer"
Feb 15 17:10:55 crc kubenswrapper[4585]: I0215 17:10:55.997310 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 15 17:10:55 crc kubenswrapper[4585]: I0215 17:10:55.998643 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.004741 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.024864 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944cfefd-484c-48e3-9f72-0ee184944d34-catalog-content\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.025134 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9957c\" (UniqueName: \"kubernetes.io/projected/944cfefd-484c-48e3-9f72-0ee184944d34-kube-api-access-9957c\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.025420 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944cfefd-484c-48e3-9f72-0ee184944d34-utilities\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.041881 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6n9mz"]
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.126533 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944cfefd-484c-48e3-9f72-0ee184944d34-catalog-content\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.126657 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9957c\" (UniqueName: \"kubernetes.io/projected/944cfefd-484c-48e3-9f72-0ee184944d34-kube-api-access-9957c\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.126798 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944cfefd-484c-48e3-9f72-0ee184944d34-utilities\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.127416 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944cfefd-484c-48e3-9f72-0ee184944d34-catalog-content\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.127714 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944cfefd-484c-48e3-9f72-0ee184944d34-utilities\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.165512 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9957c\" (UniqueName: \"kubernetes.io/projected/944cfefd-484c-48e3-9f72-0ee184944d34-kube-api-access-9957c\") pod \"community-operators-6n9mz\" (UID: \"944cfefd-484c-48e3-9f72-0ee184944d34\") " pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.217178 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.235744 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.235795 4585 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="cf58475869fcc5c135583d246a4aa500d5df951c911c78a96443a18fc62912e2" exitCode=137
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.235825 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"cf58475869fcc5c135583d246a4aa500d5df951c911c78a96443a18fc62912e2"}
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.235859 4585 scope.go:117] "RemoveContainer" containerID="ac11e112aa6831452dc7312d3e7dc8ce804a94234318d30d5515fb7a68cb7de4"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.367620 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:10:56 crc kubenswrapper[4585]: I0215 17:10:56.559551 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6n9mz"]
Feb 15 17:10:56 crc kubenswrapper[4585]: W0215 17:10:56.569716 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944cfefd_484c_48e3_9f72_0ee184944d34.slice/crio-3e9994bfd98dd26c3b34120a4d9e6a145e4f544bea23786c08871565680e3d74 WatchSource:0}: Error finding container 3e9994bfd98dd26c3b34120a4d9e6a145e4f544bea23786c08871565680e3d74: Status 404 returned error can't find the container with id 3e9994bfd98dd26c3b34120a4d9e6a145e4f544bea23786c08871565680e3d74
Feb 15 17:10:57 crc kubenswrapper[4585]: I0215 17:10:57.245947 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 15 17:10:57 crc kubenswrapper[4585]: I0215 17:10:57.247996 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"966b4559a8ce366e460f2745bc274a24b5f4c0c411ee56f36162186144c0056b"}
Feb 15 17:10:57 crc kubenswrapper[4585]: I0215 17:10:57.250269 4585 generic.go:334] "Generic (PLEG): container finished" podID="944cfefd-484c-48e3-9f72-0ee184944d34" containerID="8d5a06015d9a2757536bfc21f3b841e374bc673904a5ee0ae2c7d86a2654b696" exitCode=0
Feb 15 17:10:57 crc kubenswrapper[4585]: I0215 17:10:57.250324 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n9mz" event={"ID":"944cfefd-484c-48e3-9f72-0ee184944d34","Type":"ContainerDied","Data":"8d5a06015d9a2757536bfc21f3b841e374bc673904a5ee0ae2c7d86a2654b696"}
Feb 15 17:10:57 crc kubenswrapper[4585]: I0215 17:10:57.250361 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n9mz" event={"ID":"944cfefd-484c-48e3-9f72-0ee184944d34","Type":"ContainerStarted","Data":"3e9994bfd98dd26c3b34120a4d9e6a145e4f544bea23786c08871565680e3d74"}
Feb 15 17:10:58 crc kubenswrapper[4585]: I0215 17:10:58.269101 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n9mz" event={"ID":"944cfefd-484c-48e3-9f72-0ee184944d34","Type":"ContainerStarted","Data":"565f0263bc6d3a5a8218cda250e3557b5d11b03e16b46654a9bfc0585eecc309"}
Feb 15 17:10:58 crc kubenswrapper[4585]: E0215 17:10:58.416109 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944cfefd_484c_48e3_9f72_0ee184944d34.slice/crio-565f0263bc6d3a5a8218cda250e3557b5d11b03e16b46654a9bfc0585eecc309.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944cfefd_484c_48e3_9f72_0ee184944d34.slice/crio-conmon-565f0263bc6d3a5a8218cda250e3557b5d11b03e16b46654a9bfc0585eecc309.scope\": RecentStats: unable to find data in memory cache]"
Feb 15 17:10:59 crc kubenswrapper[4585]: I0215 17:10:59.277208 4585 generic.go:334] "Generic (PLEG): container finished" podID="944cfefd-484c-48e3-9f72-0ee184944d34" containerID="565f0263bc6d3a5a8218cda250e3557b5d11b03e16b46654a9bfc0585eecc309" exitCode=0
Feb 15 17:10:59 crc kubenswrapper[4585]: I0215 17:10:59.277294 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n9mz" event={"ID":"944cfefd-484c-48e3-9f72-0ee184944d34","Type":"ContainerDied","Data":"565f0263bc6d3a5a8218cda250e3557b5d11b03e16b46654a9bfc0585eecc309"}
Feb 15 17:11:00 crc kubenswrapper[4585]: I0215 17:11:00.286755 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n9mz" event={"ID":"944cfefd-484c-48e3-9f72-0ee184944d34","Type":"ContainerStarted","Data":"dd2f485638d39cb2f19a69d727abc207c7bf86875d3b6834c84e09280f0fddf6"}
Feb 15 17:11:00 crc kubenswrapper[4585]: I0215 17:11:00.319079 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6n9mz" podStartSLOduration=2.892743638 podStartE2EDuration="5.319052109s" podCreationTimestamp="2026-02-15 17:10:55 +0000 UTC" firstStartedPulling="2026-02-15 17:10:57.25221756 +0000 UTC m=+313.195625732" lastFinishedPulling="2026-02-15 17:10:59.678526031 +0000 UTC m=+315.621934203" observedRunningTime="2026-02-15 17:11:00.314306844 +0000 UTC m=+316.257714986" watchObservedRunningTime="2026-02-15 17:11:00.319052109 +0000 UTC m=+316.262460281"
Feb 15 17:11:02 crc kubenswrapper[4585]: I0215 17:11:02.272754 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 15 17:11:05 crc kubenswrapper[4585]: I0215 17:11:05.593132 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 15 17:11:05 crc kubenswrapper[4585]: I0215 17:11:05.600387 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 15 17:11:06 crc kubenswrapper[4585]: I0215 17:11:06.334951 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 15 17:11:06 crc kubenswrapper[4585]: I0215 17:11:06.368745 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:11:06 crc kubenswrapper[4585]: I0215 17:11:06.369867 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:11:06 crc kubenswrapper[4585]: I0215 17:11:06.434838 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:11:07 crc kubenswrapper[4585]: I0215 17:11:07.420514 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6n9mz"
Feb 15 17:11:08 crc kubenswrapper[4585]: I0215 17:11:08.957803 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tcrgd"]
Feb 15 17:11:08 crc kubenswrapper[4585]: I0215 17:11:08.960003 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tcrgd"
Feb 15 17:11:08 crc kubenswrapper[4585]: I0215 17:11:08.974017 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcrgd"]
Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.115142 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e06966bc-b5b4-4b56-b7e0-ecd065633b99-catalog-content\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd"
Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.115204 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e06966bc-b5b4-4b56-b7e0-ecd065633b99-utilities\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd"
Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.118635 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkvn\" (UniqueName: \"kubernetes.io/projected/e06966bc-b5b4-4b56-b7e0-ecd065633b99-kube-api-access-5pkvn\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd"
Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.220064 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e06966bc-b5b4-4b56-b7e0-ecd065633b99-utilities\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd"
Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.220184 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkvn\" (UniqueName: \"kubernetes.io/projected/e06966bc-b5b4-4b56-b7e0-ecd065633b99-kube-api-access-5pkvn\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd"
Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.220419 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e06966bc-b5b4-4b56-b7e0-ecd065633b99-catalog-content\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd"
Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.221053 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e06966bc-b5b4-4b56-b7e0-ecd065633b99-utilities\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") "
pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.221211 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e06966bc-b5b4-4b56-b7e0-ecd065633b99-catalog-content\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.251524 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkvn\" (UniqueName: \"kubernetes.io/projected/e06966bc-b5b4-4b56-b7e0-ecd065633b99-kube-api-access-5pkvn\") pod \"community-operators-tcrgd\" (UID: \"e06966bc-b5b4-4b56-b7e0-ecd065633b99\") " pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.297222 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:09 crc kubenswrapper[4585]: I0215 17:11:09.757515 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tcrgd"] Feb 15 17:11:10 crc kubenswrapper[4585]: I0215 17:11:10.355293 4585 generic.go:334] "Generic (PLEG): container finished" podID="e06966bc-b5b4-4b56-b7e0-ecd065633b99" containerID="3b82ba5761cd8d322cf016882fe051c9bfbf34a56abbbe5e36050e70682ac2d4" exitCode=0 Feb 15 17:11:10 crc kubenswrapper[4585]: I0215 17:11:10.355376 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcrgd" event={"ID":"e06966bc-b5b4-4b56-b7e0-ecd065633b99","Type":"ContainerDied","Data":"3b82ba5761cd8d322cf016882fe051c9bfbf34a56abbbe5e36050e70682ac2d4"} Feb 15 17:11:10 crc kubenswrapper[4585]: I0215 17:11:10.355567 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcrgd" 
event={"ID":"e06966bc-b5b4-4b56-b7e0-ecd065633b99","Type":"ContainerStarted","Data":"00392983a80c76834ed9265880878d46af53c53aba5061fa133018ae9dd99a9d"} Feb 15 17:11:11 crc kubenswrapper[4585]: I0215 17:11:11.366681 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcrgd" event={"ID":"e06966bc-b5b4-4b56-b7e0-ecd065633b99","Type":"ContainerStarted","Data":"755207f61b8f60c2dfc83726f0cb672c46e860da82b62dc30c59f53c825c72d4"} Feb 15 17:11:11 crc kubenswrapper[4585]: I0215 17:11:11.951161 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p2lqh"] Feb 15 17:11:11 crc kubenswrapper[4585]: I0215 17:11:11.953192 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:11 crc kubenswrapper[4585]: I0215 17:11:11.955827 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996d261a-6fd6-44a8-bb7b-78e9f2024a05-utilities\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:11 crc kubenswrapper[4585]: I0215 17:11:11.955887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996d261a-6fd6-44a8-bb7b-78e9f2024a05-catalog-content\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:11 crc kubenswrapper[4585]: I0215 17:11:11.955936 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79zfp\" (UniqueName: \"kubernetes.io/projected/996d261a-6fd6-44a8-bb7b-78e9f2024a05-kube-api-access-79zfp\") pod \"community-operators-p2lqh\" (UID: 
\"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:11 crc kubenswrapper[4585]: I0215 17:11:11.967914 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2lqh"] Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.056905 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996d261a-6fd6-44a8-bb7b-78e9f2024a05-utilities\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.056964 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996d261a-6fd6-44a8-bb7b-78e9f2024a05-catalog-content\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.057011 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79zfp\" (UniqueName: \"kubernetes.io/projected/996d261a-6fd6-44a8-bb7b-78e9f2024a05-kube-api-access-79zfp\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.057941 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996d261a-6fd6-44a8-bb7b-78e9f2024a05-catalog-content\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.057989 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996d261a-6fd6-44a8-bb7b-78e9f2024a05-utilities\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.092751 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79zfp\" (UniqueName: \"kubernetes.io/projected/996d261a-6fd6-44a8-bb7b-78e9f2024a05-kube-api-access-79zfp\") pod \"community-operators-p2lqh\" (UID: \"996d261a-6fd6-44a8-bb7b-78e9f2024a05\") " pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.293280 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.374684 4585 generic.go:334] "Generic (PLEG): container finished" podID="e06966bc-b5b4-4b56-b7e0-ecd065633b99" containerID="755207f61b8f60c2dfc83726f0cb672c46e860da82b62dc30c59f53c825c72d4" exitCode=0 Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.374738 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcrgd" event={"ID":"e06966bc-b5b4-4b56-b7e0-ecd065633b99","Type":"ContainerDied","Data":"755207f61b8f60c2dfc83726f0cb672c46e860da82b62dc30c59f53c825c72d4"} Feb 15 17:11:12 crc kubenswrapper[4585]: I0215 17:11:12.765081 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2lqh"] Feb 15 17:11:12 crc kubenswrapper[4585]: W0215 17:11:12.784321 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996d261a_6fd6_44a8_bb7b_78e9f2024a05.slice/crio-04800ac208aa44799da35b1a44fd4bc6761c3bd184565dcd6417f00387968ef5 WatchSource:0}: Error finding container 04800ac208aa44799da35b1a44fd4bc6761c3bd184565dcd6417f00387968ef5: 
Status 404 returned error can't find the container with id 04800ac208aa44799da35b1a44fd4bc6761c3bd184565dcd6417f00387968ef5 Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.350616 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcdtv"] Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.351787 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.379668 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e442ed18-9718-4f0d-b548-4769856b3b5d-catalog-content\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.379739 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e442ed18-9718-4f0d-b548-4769856b3b5d-utilities\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.379771 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdc68\" (UniqueName: \"kubernetes.io/projected/e442ed18-9718-4f0d-b548-4769856b3b5d-kube-api-access-qdc68\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.379834 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcdtv"] Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.383901 4585 generic.go:334] 
"Generic (PLEG): container finished" podID="996d261a-6fd6-44a8-bb7b-78e9f2024a05" containerID="722723069c759b0aa5c9618978d0e5c048c8b953ba162993d15bbe2b5ffb75c8" exitCode=0 Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.383996 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lqh" event={"ID":"996d261a-6fd6-44a8-bb7b-78e9f2024a05","Type":"ContainerDied","Data":"722723069c759b0aa5c9618978d0e5c048c8b953ba162993d15bbe2b5ffb75c8"} Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.384030 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lqh" event={"ID":"996d261a-6fd6-44a8-bb7b-78e9f2024a05","Type":"ContainerStarted","Data":"04800ac208aa44799da35b1a44fd4bc6761c3bd184565dcd6417f00387968ef5"} Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.387112 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tcrgd" event={"ID":"e06966bc-b5b4-4b56-b7e0-ecd065633b99","Type":"ContainerStarted","Data":"391dd8fa3801e2fe8500afb2128f24c430b53b8e447e7cac3eb77cb638e1df09"} Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.433838 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tcrgd" podStartSLOduration=3.001298146 podStartE2EDuration="5.433820358s" podCreationTimestamp="2026-02-15 17:11:08 +0000 UTC" firstStartedPulling="2026-02-15 17:11:10.357998234 +0000 UTC m=+326.301406376" lastFinishedPulling="2026-02-15 17:11:12.790520456 +0000 UTC m=+328.733928588" observedRunningTime="2026-02-15 17:11:13.425755156 +0000 UTC m=+329.369163308" watchObservedRunningTime="2026-02-15 17:11:13.433820358 +0000 UTC m=+329.377228490" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.480735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e442ed18-9718-4f0d-b548-4769856b3b5d-catalog-content\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.481087 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e442ed18-9718-4f0d-b548-4769856b3b5d-utilities\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.481226 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdc68\" (UniqueName: \"kubernetes.io/projected/e442ed18-9718-4f0d-b548-4769856b3b5d-kube-api-access-qdc68\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.481560 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e442ed18-9718-4f0d-b548-4769856b3b5d-catalog-content\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.481586 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e442ed18-9718-4f0d-b548-4769856b3b5d-utilities\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.514748 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdc68\" (UniqueName: 
\"kubernetes.io/projected/e442ed18-9718-4f0d-b548-4769856b3b5d-kube-api-access-qdc68\") pod \"community-operators-fcdtv\" (UID: \"e442ed18-9718-4f0d-b548-4769856b3b5d\") " pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.679217 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.861180 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"] Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.861809 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" podUID="d9c426cb-d8ae-4150-adb2-327d42b7df5b" containerName="route-controller-manager" containerID="cri-o://b72af6a523a88d6dc2a537455e69d7a896b5491f2b1da4b72bc772dc97ccf591" gracePeriod=30 Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.864057 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-69rgc"] Feb 15 17:11:13 crc kubenswrapper[4585]: I0215 17:11:13.864375 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" podUID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" containerName="controller-manager" containerID="cri-o://a23241966c44a5a88cc68d83613790baa563b81ddf7f94457d7dd8e50f0aa887" gracePeriod=30 Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.024478 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcdtv"] Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.392272 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lqh" 
event={"ID":"996d261a-6fd6-44a8-bb7b-78e9f2024a05","Type":"ContainerStarted","Data":"9d9674b506cd95d512fcc7b196c7b177f49d692513404ef57e3914aea770cb31"} Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.403133 4585 generic.go:334] "Generic (PLEG): container finished" podID="e442ed18-9718-4f0d-b548-4769856b3b5d" containerID="42a34b086ee51c11d3072093ad68fae30cd0f6724e90fcdf29e63d1b5e4239db" exitCode=0 Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.403223 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdtv" event={"ID":"e442ed18-9718-4f0d-b548-4769856b3b5d","Type":"ContainerDied","Data":"42a34b086ee51c11d3072093ad68fae30cd0f6724e90fcdf29e63d1b5e4239db"} Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.403263 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdtv" event={"ID":"e442ed18-9718-4f0d-b548-4769856b3b5d","Type":"ContainerStarted","Data":"b54ca6f9c1a222cc98b28ab54589c11cde38dd3c49fd7d7b6c0f22dbda554da8"} Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.405729 4585 generic.go:334] "Generic (PLEG): container finished" podID="d9c426cb-d8ae-4150-adb2-327d42b7df5b" containerID="b72af6a523a88d6dc2a537455e69d7a896b5491f2b1da4b72bc772dc97ccf591" exitCode=0 Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.405802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" event={"ID":"d9c426cb-d8ae-4150-adb2-327d42b7df5b","Type":"ContainerDied","Data":"b72af6a523a88d6dc2a537455e69d7a896b5491f2b1da4b72bc772dc97ccf591"} Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.405827 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" 
event={"ID":"d9c426cb-d8ae-4150-adb2-327d42b7df5b","Type":"ContainerDied","Data":"5237edb6f5dddf5c4c6340bd30e5f0304b9991ce15682feb9e75020bb611a4a5"} Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.405839 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5237edb6f5dddf5c4c6340bd30e5f0304b9991ce15682feb9e75020bb611a4a5" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.407777 4585 generic.go:334] "Generic (PLEG): container finished" podID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" containerID="a23241966c44a5a88cc68d83613790baa563b81ddf7f94457d7dd8e50f0aa887" exitCode=0 Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.407831 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" event={"ID":"fe40fba3-3513-4a78-905a-15ffd6b2f8b2","Type":"ContainerDied","Data":"a23241966c44a5a88cc68d83613790baa563b81ddf7f94457d7dd8e50f0aa887"} Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.418791 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.425902 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.548874 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r4fvv"] Feb 15 17:11:14 crc kubenswrapper[4585]: E0215 17:11:14.549336 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c426cb-d8ae-4150-adb2-327d42b7df5b" containerName="route-controller-manager" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.549348 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c426cb-d8ae-4150-adb2-327d42b7df5b" containerName="route-controller-manager" Feb 15 17:11:14 crc kubenswrapper[4585]: E0215 17:11:14.549358 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" containerName="controller-manager" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.549364 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" containerName="controller-manager" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.549452 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c426cb-d8ae-4150-adb2-327d42b7df5b" containerName="route-controller-manager" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.549465 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" containerName="controller-manager" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.550147 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.568249 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4fvv"] Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600065 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9c426cb-d8ae-4150-adb2-327d42b7df5b-serving-cert\") pod \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600147 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-client-ca\") pod \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600188 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-252xr\" (UniqueName: \"kubernetes.io/projected/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-kube-api-access-252xr\") pod \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600224 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-config\") pod \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600252 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-client-ca\") pod \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " Feb 15 
17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600293 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p84r\" (UniqueName: \"kubernetes.io/projected/d9c426cb-d8ae-4150-adb2-327d42b7df5b-kube-api-access-7p84r\") pod \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600459 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-serving-cert\") pod \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600490 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-config\") pod \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\" (UID: \"d9c426cb-d8ae-4150-adb2-327d42b7df5b\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.600514 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-proxy-ca-bundles\") pod \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\" (UID: \"fe40fba3-3513-4a78-905a-15ffd6b2f8b2\") " Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.601015 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-client-ca" (OuterVolumeSpecName: "client-ca") pod "fe40fba3-3513-4a78-905a-15ffd6b2f8b2" (UID: "fe40fba3-3513-4a78-905a-15ffd6b2f8b2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.601102 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-config" (OuterVolumeSpecName: "config") pod "fe40fba3-3513-4a78-905a-15ffd6b2f8b2" (UID: "fe40fba3-3513-4a78-905a-15ffd6b2f8b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.601386 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fe40fba3-3513-4a78-905a-15ffd6b2f8b2" (UID: "fe40fba3-3513-4a78-905a-15ffd6b2f8b2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.601471 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-client-ca" (OuterVolumeSpecName: "client-ca") pod "d9c426cb-d8ae-4150-adb2-327d42b7df5b" (UID: "d9c426cb-d8ae-4150-adb2-327d42b7df5b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.601570 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-config" (OuterVolumeSpecName: "config") pod "d9c426cb-d8ae-4150-adb2-327d42b7df5b" (UID: "d9c426cb-d8ae-4150-adb2-327d42b7df5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.606666 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c426cb-d8ae-4150-adb2-327d42b7df5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d9c426cb-d8ae-4150-adb2-327d42b7df5b" (UID: "d9c426cb-d8ae-4150-adb2-327d42b7df5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.606734 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fe40fba3-3513-4a78-905a-15ffd6b2f8b2" (UID: "fe40fba3-3513-4a78-905a-15ffd6b2f8b2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.607084 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-kube-api-access-252xr" (OuterVolumeSpecName: "kube-api-access-252xr") pod "fe40fba3-3513-4a78-905a-15ffd6b2f8b2" (UID: "fe40fba3-3513-4a78-905a-15ffd6b2f8b2"). InnerVolumeSpecName "kube-api-access-252xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.607961 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c426cb-d8ae-4150-adb2-327d42b7df5b-kube-api-access-7p84r" (OuterVolumeSpecName: "kube-api-access-7p84r") pod "d9c426cb-d8ae-4150-adb2-327d42b7df5b" (UID: "d9c426cb-d8ae-4150-adb2-327d42b7df5b"). InnerVolumeSpecName "kube-api-access-7p84r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.701982 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqd8\" (UniqueName: \"kubernetes.io/projected/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-kube-api-access-pfqd8\") pod \"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702076 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-utilities\") pod \"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702094 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-catalog-content\") pod \"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702131 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702142 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702150 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702161 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9c426cb-d8ae-4150-adb2-327d42b7df5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702168 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c426cb-d8ae-4150-adb2-327d42b7df5b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702178 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-252xr\" (UniqueName: \"kubernetes.io/projected/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-kube-api-access-252xr\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702185 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702192 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe40fba3-3513-4a78-905a-15ffd6b2f8b2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:14 crc kubenswrapper[4585]: I0215 17:11:14.702200 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p84r\" (UniqueName: \"kubernetes.io/projected/d9c426cb-d8ae-4150-adb2-327d42b7df5b-kube-api-access-7p84r\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:14.805020 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqd8\" (UniqueName: \"kubernetes.io/projected/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-kube-api-access-pfqd8\") pod 
\"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:14.805137 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-utilities\") pod \"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:14.805165 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-catalog-content\") pod \"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:14.805706 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-catalog-content\") pod \"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:14.805782 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-utilities\") pod \"community-operators-r4fvv\" (UID: \"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:14.832697 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqd8\" (UniqueName: \"kubernetes.io/projected/6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b-kube-api-access-pfqd8\") pod \"community-operators-r4fvv\" (UID: 
\"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b\") " pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:14.862618 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4fvv" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.419449 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" event={"ID":"fe40fba3-3513-4a78-905a-15ffd6b2f8b2","Type":"ContainerDied","Data":"fe34f6a6c6dde9e11fc4d9b275bc4440c877503edf38f80abcb2786ef64c6dc4"} Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.420132 4585 scope.go:117] "RemoveContainer" containerID="a23241966c44a5a88cc68d83613790baa563b81ddf7f94457d7dd8e50f0aa887" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.420242 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-69rgc" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.427164 4585 generic.go:334] "Generic (PLEG): container finished" podID="996d261a-6fd6-44a8-bb7b-78e9f2024a05" containerID="9d9674b506cd95d512fcc7b196c7b177f49d692513404ef57e3914aea770cb31" exitCode=0 Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.427242 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lqh" event={"ID":"996d261a-6fd6-44a8-bb7b-78e9f2024a05","Type":"ContainerDied","Data":"9d9674b506cd95d512fcc7b196c7b177f49d692513404ef57e3914aea770cb31"} Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.444370 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.444706 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdtv" event={"ID":"e442ed18-9718-4f0d-b548-4769856b3b5d","Type":"ContainerStarted","Data":"35cc983b978d5f10978fb9762fd2efb523509c71bda43d7249b138ab1176fa23"} Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.493640 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-69rgc"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.506700 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-69rgc"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.521739 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.525290 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6jtm8"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.709344 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4fvv"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.745500 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-flrdq"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.746430 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.761715 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flrdq"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.835350 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2324ac96-77b5-46b6-a109-22380ef0475d-catalog-content\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.835796 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2324ac96-77b5-46b6-a109-22380ef0475d-utilities\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.835867 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzhsg\" (UniqueName: \"kubernetes.io/projected/2324ac96-77b5-46b6-a109-22380ef0475d-kube-api-access-hzhsg\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.929417 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.930148 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.932679 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.933276 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64654fc454-wclbx"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.933563 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.933668 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.933802 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.934389 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.936106 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.936741 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.937334 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2324ac96-77b5-46b6-a109-22380ef0475d-utilities\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.937419 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhsg\" (UniqueName: \"kubernetes.io/projected/2324ac96-77b5-46b6-a109-22380ef0475d-kube-api-access-hzhsg\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.937460 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2324ac96-77b5-46b6-a109-22380ef0475d-catalog-content\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.937944 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2324ac96-77b5-46b6-a109-22380ef0475d-catalog-content\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " 
pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.938627 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2324ac96-77b5-46b6-a109-22380ef0475d-utilities\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.943962 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.944097 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.947288 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64654fc454-wclbx"] Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.952312 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.952365 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.952626 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.953633 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.953701 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.966179 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhsg\" (UniqueName: \"kubernetes.io/projected/2324ac96-77b5-46b6-a109-22380ef0475d-kube-api-access-hzhsg\") pod \"community-operators-flrdq\" (UID: \"2324ac96-77b5-46b6-a109-22380ef0475d\") " pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:15 crc kubenswrapper[4585]: I0215 17:11:15.967086 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039409 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-config\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039498 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-client-ca\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039533 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-client-ca\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039558 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-proxy-ca-bundles\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039762 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-serving-cert\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039811 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlkbv\" (UniqueName: \"kubernetes.io/projected/b74d4122-7aa3-4eed-9991-4c19067b911e-kube-api-access-hlkbv\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039868 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-config\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039945 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrjjf\" (UniqueName: \"kubernetes.io/projected/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-kube-api-access-zrjjf\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: 
\"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.039976 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b74d4122-7aa3-4eed-9991-4c19067b911e-serving-cert\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.111827 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flrdq" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143411 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrjjf\" (UniqueName: \"kubernetes.io/projected/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-kube-api-access-zrjjf\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143476 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b74d4122-7aa3-4eed-9991-4c19067b911e-serving-cert\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143511 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-config\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " 
pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143568 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-client-ca\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143592 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-client-ca\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143638 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-proxy-ca-bundles\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143663 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-serving-cert\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143690 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlkbv\" (UniqueName: 
\"kubernetes.io/projected/b74d4122-7aa3-4eed-9991-4c19067b911e-kube-api-access-hlkbv\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.143735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-config\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.145073 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-config\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.160523 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b74d4122-7aa3-4eed-9991-4c19067b911e-serving-cert\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.162145 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-config\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.163050 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-client-ca\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.169169 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-client-ca\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.178230 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-proxy-ca-bundles\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.180377 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrjjf\" (UniqueName: \"kubernetes.io/projected/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-kube-api-access-zrjjf\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.189834 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef425efb-a5ab-4d80-b702-4f1d16a1acfd-serving-cert\") pod \"route-controller-manager-978fc87b6-h5vb8\" (UID: \"ef425efb-a5ab-4d80-b702-4f1d16a1acfd\") " 
pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.201493 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlkbv\" (UniqueName: \"kubernetes.io/projected/b74d4122-7aa3-4eed-9991-4c19067b911e-kube-api-access-hlkbv\") pod \"controller-manager-64654fc454-wclbx\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.257150 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.875894 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.907448 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c426cb-d8ae-4150-adb2-327d42b7df5b" path="/var/lib/kubelet/pods/d9c426cb-d8ae-4150-adb2-327d42b7df5b/volumes" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.908777 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe40fba3-3513-4a78-905a-15ffd6b2f8b2" path="/var/lib/kubelet/pods/fe40fba3-3513-4a78-905a-15ffd6b2f8b2/volumes" Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.914765 4585 generic.go:334] "Generic (PLEG): container finished" podID="6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b" containerID="786bf14e0d9df6f0b5c4d92304cb64b71c2b43fa938ee77a695b35ca35649df4" exitCode=0 Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.914834 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fvv" 
event={"ID":"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b","Type":"ContainerDied","Data":"786bf14e0d9df6f0b5c4d92304cb64b71c2b43fa938ee77a695b35ca35649df4"} Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.914883 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fvv" event={"ID":"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b","Type":"ContainerStarted","Data":"e2cbfd44b1984493a5739c1f05bf4ebb45c613bbc7a7a3d9cea4dac686e9b108"} Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.920440 4585 generic.go:334] "Generic (PLEG): container finished" podID="e442ed18-9718-4f0d-b548-4769856b3b5d" containerID="35cc983b978d5f10978fb9762fd2efb523509c71bda43d7249b138ab1176fa23" exitCode=0 Feb 15 17:11:16 crc kubenswrapper[4585]: I0215 17:11:16.920466 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdtv" event={"ID":"e442ed18-9718-4f0d-b548-4769856b3b5d","Type":"ContainerDied","Data":"35cc983b978d5f10978fb9762fd2efb523509c71bda43d7249b138ab1176fa23"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.087640 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.087781 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.162261 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9nwr"] Feb 15 17:11:17 crc 
kubenswrapper[4585]: I0215 17:11:17.163765 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.185476 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9nwr"] Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.294798 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxvz\" (UniqueName: \"kubernetes.io/projected/ec374614-bada-412d-a649-dc86e3ddaa43-kube-api-access-pxxvz\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.294834 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec374614-bada-412d-a649-dc86e3ddaa43-utilities\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.294901 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec374614-bada-412d-a649-dc86e3ddaa43-catalog-content\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.339027 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flrdq"] Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.399550 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxxvz\" (UniqueName: 
\"kubernetes.io/projected/ec374614-bada-412d-a649-dc86e3ddaa43-kube-api-access-pxxvz\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.401394 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec374614-bada-412d-a649-dc86e3ddaa43-utilities\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.401516 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec374614-bada-412d-a649-dc86e3ddaa43-catalog-content\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.402713 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec374614-bada-412d-a649-dc86e3ddaa43-utilities\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.402867 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec374614-bada-412d-a649-dc86e3ddaa43-catalog-content\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.424665 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64654fc454-wclbx"] Feb 15 17:11:17 crc 
kubenswrapper[4585]: I0215 17:11:17.439177 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxxvz\" (UniqueName: \"kubernetes.io/projected/ec374614-bada-412d-a649-dc86e3ddaa43-kube-api-access-pxxvz\") pod \"community-operators-f9nwr\" (UID: \"ec374614-bada-412d-a649-dc86e3ddaa43\") " pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: W0215 17:11:17.454444 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74d4122_7aa3_4eed_9991_4c19067b911e.slice/crio-f5b0949af51fbcfe740d6e629601b49f469e1d70065f5fb04d2ad056ecff9544 WatchSource:0}: Error finding container f5b0949af51fbcfe740d6e629601b49f469e1d70065f5fb04d2ad056ecff9544: Status 404 returned error can't find the container with id f5b0949af51fbcfe740d6e629601b49f469e1d70065f5fb04d2ad056ecff9544 Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.470506 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8"] Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.486162 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9nwr" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.937242 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2lqh" event={"ID":"996d261a-6fd6-44a8-bb7b-78e9f2024a05","Type":"ContainerStarted","Data":"4a9c8c90b35852ad07720366af0e47a1fc861e3e6504e94c75c2817b96821b01"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.939108 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9nwr"] Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.943886 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fvv" event={"ID":"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b","Type":"ContainerStarted","Data":"4ac0f5ada9d3dd34d4d3a76bc0f9448ca5b93d9f5c59d841003f23e712d0820a"} Feb 15 17:11:17 crc kubenswrapper[4585]: W0215 17:11:17.946617 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec374614_bada_412d_a649_dc86e3ddaa43.slice/crio-98fadd08e2bc7f83e5476600673bad1e4a9c220263333fc724bd5a6dd4c1a2b1 WatchSource:0}: Error finding container 98fadd08e2bc7f83e5476600673bad1e4a9c220263333fc724bd5a6dd4c1a2b1: Status 404 returned error can't find the container with id 98fadd08e2bc7f83e5476600673bad1e4a9c220263333fc724bd5a6dd4c1a2b1 Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.949118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcdtv" event={"ID":"e442ed18-9718-4f0d-b548-4769856b3b5d","Type":"ContainerStarted","Data":"e1b21fdcd93026aabed0bd9b7327947977621c09a5b1447db87f65cbdcd45e99"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.957063 4585 generic.go:334] "Generic (PLEG): container finished" podID="2324ac96-77b5-46b6-a109-22380ef0475d" 
containerID="b3d8cf760338a48d3148e184115f47baefa9c21937f3e2c81bedf26466b62a84" exitCode=0 Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.957136 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flrdq" event={"ID":"2324ac96-77b5-46b6-a109-22380ef0475d","Type":"ContainerDied","Data":"b3d8cf760338a48d3148e184115f47baefa9c21937f3e2c81bedf26466b62a84"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.957166 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flrdq" event={"ID":"2324ac96-77b5-46b6-a109-22380ef0475d","Type":"ContainerStarted","Data":"3d35a593f6826f74b1bdbe9dff99c66588136a4753227f96b24c5d2eda3f2d7b"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.967366 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" event={"ID":"b74d4122-7aa3-4eed-9991-4c19067b911e","Type":"ContainerStarted","Data":"354df0492d1e475974568ba6a55f64271f2cf8f53da08a19b90a0a2d03de2f4c"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.967419 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" event={"ID":"b74d4122-7aa3-4eed-9991-4c19067b911e","Type":"ContainerStarted","Data":"f5b0949af51fbcfe740d6e629601b49f469e1d70065f5fb04d2ad056ecff9544"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.969195 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.974397 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p2lqh" podStartSLOduration=3.124908769 podStartE2EDuration="6.974372208s" podCreationTimestamp="2026-02-15 17:11:11 +0000 UTC" firstStartedPulling="2026-02-15 17:11:13.386427321 +0000 UTC 
m=+329.329835463" lastFinishedPulling="2026-02-15 17:11:17.23589077 +0000 UTC m=+333.179298902" observedRunningTime="2026-02-15 17:11:17.96901361 +0000 UTC m=+333.912421732" watchObservedRunningTime="2026-02-15 17:11:17.974372208 +0000 UTC m=+333.917780340" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.988785 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" event={"ID":"ef425efb-a5ab-4d80-b702-4f1d16a1acfd","Type":"ContainerStarted","Data":"90454f0a287b942fb68a01c5063070891cc9572b965ea82c3495c2e7b92f601e"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.989176 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" event={"ID":"ef425efb-a5ab-4d80-b702-4f1d16a1acfd","Type":"ContainerStarted","Data":"69f2796fb13e6acc3cccdcf163b70dcb8ebf0a976d021b393a5688000558d94f"} Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.990137 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:17 crc kubenswrapper[4585]: I0215 17:11:17.995958 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcdtv" podStartSLOduration=1.985059901 podStartE2EDuration="4.995943382s" podCreationTimestamp="2026-02-15 17:11:13 +0000 UTC" firstStartedPulling="2026-02-15 17:11:14.40454587 +0000 UTC m=+330.347954002" lastFinishedPulling="2026-02-15 17:11:17.415429351 +0000 UTC m=+333.358837483" observedRunningTime="2026-02-15 17:11:17.994294877 +0000 UTC m=+333.937703009" watchObservedRunningTime="2026-02-15 17:11:17.995943382 +0000 UTC m=+333.939351514" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.004137 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.083835 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" podStartSLOduration=5.083806926 podStartE2EDuration="5.083806926s" podCreationTimestamp="2026-02-15 17:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:11:18.080679049 +0000 UTC m=+334.024087181" watchObservedRunningTime="2026-02-15 17:11:18.083806926 +0000 UTC m=+334.027215058" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.107894 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" podStartSLOduration=4.107870609 podStartE2EDuration="4.107870609s" podCreationTimestamp="2026-02-15 17:11:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:11:18.103201041 +0000 UTC m=+334.046609163" watchObservedRunningTime="2026-02-15 17:11:18.107870609 +0000 UTC m=+334.051278741" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.267088 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-978fc87b6-h5vb8" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.553901 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q5q42"] Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.555262 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.572985 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5q42"] Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.738253 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817eac9e-3482-4350-bc11-f58ec7bad74c-utilities\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.738406 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbpr\" (UniqueName: \"kubernetes.io/projected/817eac9e-3482-4350-bc11-f58ec7bad74c-kube-api-access-vgbpr\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.738461 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817eac9e-3482-4350-bc11-f58ec7bad74c-catalog-content\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.839854 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817eac9e-3482-4350-bc11-f58ec7bad74c-catalog-content\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.840333 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/817eac9e-3482-4350-bc11-f58ec7bad74c-catalog-content\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.840750 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817eac9e-3482-4350-bc11-f58ec7bad74c-utilities\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.846573 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/817eac9e-3482-4350-bc11-f58ec7bad74c-utilities\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.846689 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgbpr\" (UniqueName: \"kubernetes.io/projected/817eac9e-3482-4350-bc11-f58ec7bad74c-kube-api-access-vgbpr\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.875339 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbpr\" (UniqueName: \"kubernetes.io/projected/817eac9e-3482-4350-bc11-f58ec7bad74c-kube-api-access-vgbpr\") pod \"community-operators-q5q42\" (UID: \"817eac9e-3482-4350-bc11-f58ec7bad74c\") " pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.998524 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b" containerID="4ac0f5ada9d3dd34d4d3a76bc0f9448ca5b93d9f5c59d841003f23e712d0820a" exitCode=0 Feb 15 17:11:18 crc kubenswrapper[4585]: I0215 17:11:18.998557 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fvv" event={"ID":"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b","Type":"ContainerDied","Data":"4ac0f5ada9d3dd34d4d3a76bc0f9448ca5b93d9f5c59d841003f23e712d0820a"} Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.000420 4585 generic.go:334] "Generic (PLEG): container finished" podID="ec374614-bada-412d-a649-dc86e3ddaa43" containerID="9ca43a0c328bea27302d4eba19385c4af436c21487cf3066bcbdb7d401850842" exitCode=0 Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.002416 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9nwr" event={"ID":"ec374614-bada-412d-a649-dc86e3ddaa43","Type":"ContainerDied","Data":"9ca43a0c328bea27302d4eba19385c4af436c21487cf3066bcbdb7d401850842"} Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.002442 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9nwr" event={"ID":"ec374614-bada-412d-a649-dc86e3ddaa43","Type":"ContainerStarted","Data":"98fadd08e2bc7f83e5476600673bad1e4a9c220263333fc724bd5a6dd4c1a2b1"} Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.168948 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5q42" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.297481 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.299962 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.378823 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.714552 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5q42"] Feb 15 17:11:19 crc kubenswrapper[4585]: W0215 17:11:19.722199 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817eac9e_3482_4350_bc11_f58ec7bad74c.slice/crio-94cbfcd95026521088c7639a4f7cdbf5e06fc004744cd0c68c7da6c41b744127 WatchSource:0}: Error finding container 94cbfcd95026521088c7639a4f7cdbf5e06fc004744cd0c68c7da6c41b744127: Status 404 returned error can't find the container with id 94cbfcd95026521088c7639a4f7cdbf5e06fc004744cd0c68c7da6c41b744127 Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.758083 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b7rq6"] Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.759138 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.798219 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7rq6"] Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.861422 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8be94f1-f28e-494b-b9c1-d97c75ea5577-utilities\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.861542 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62bt\" (UniqueName: \"kubernetes.io/projected/a8be94f1-f28e-494b-b9c1-d97c75ea5577-kube-api-access-p62bt\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.861577 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8be94f1-f28e-494b-b9c1-d97c75ea5577-catalog-content\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.962395 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p62bt\" (UniqueName: \"kubernetes.io/projected/a8be94f1-f28e-494b-b9c1-d97c75ea5577-kube-api-access-p62bt\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.963012 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8be94f1-f28e-494b-b9c1-d97c75ea5577-catalog-content\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.963053 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8be94f1-f28e-494b-b9c1-d97c75ea5577-utilities\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.963884 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8be94f1-f28e-494b-b9c1-d97c75ea5577-utilities\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.963959 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8be94f1-f28e-494b-b9c1-d97c75ea5577-catalog-content\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:19 crc kubenswrapper[4585]: I0215 17:11:19.988793 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p62bt\" (UniqueName: \"kubernetes.io/projected/a8be94f1-f28e-494b-b9c1-d97c75ea5577-kube-api-access-p62bt\") pod \"community-operators-b7rq6\" (UID: \"a8be94f1-f28e-494b-b9c1-d97c75ea5577\") " pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.015834 4585 generic.go:334] "Generic (PLEG): container 
finished" podID="817eac9e-3482-4350-bc11-f58ec7bad74c" containerID="485814ddc7582669e82383ca0383056b59d5a25b158274e7a445b319e2991438" exitCode=0 Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.016087 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5q42" event={"ID":"817eac9e-3482-4350-bc11-f58ec7bad74c","Type":"ContainerDied","Data":"485814ddc7582669e82383ca0383056b59d5a25b158274e7a445b319e2991438"} Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.016937 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5q42" event={"ID":"817eac9e-3482-4350-bc11-f58ec7bad74c","Type":"ContainerStarted","Data":"94cbfcd95026521088c7639a4f7cdbf5e06fc004744cd0c68c7da6c41b744127"} Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.024516 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fvv" event={"ID":"6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b","Type":"ContainerStarted","Data":"0d96bca66eadc8f46de6d9b93235b2188e3a98fa979342ece77fe168bab33340"} Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.028982 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flrdq" event={"ID":"2324ac96-77b5-46b6-a109-22380ef0475d","Type":"ContainerStarted","Data":"90accddfda7ea397f779364192636962e41ee5afdd8e65c8334c874cde3e1ae7"} Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.073728 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7rq6" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.091239 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r4fvv" podStartSLOduration=3.505292639 podStartE2EDuration="6.091195699s" podCreationTimestamp="2026-02-15 17:11:14 +0000 UTC" firstStartedPulling="2026-02-15 17:11:16.916664545 +0000 UTC m=+332.860072677" lastFinishedPulling="2026-02-15 17:11:19.502567605 +0000 UTC m=+335.445975737" observedRunningTime="2026-02-15 17:11:20.089586075 +0000 UTC m=+336.032994227" watchObservedRunningTime="2026-02-15 17:11:20.091195699 +0000 UTC m=+336.034603831" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.106293 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tcrgd" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.347122 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-685gs"] Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.348697 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.376031 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-685gs"] Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472161 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlv77\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-kube-api-access-qlv77\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472441 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472465 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3791091f-1f3d-4617-80c0-6a17ac645dc2-trusted-ca\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472489 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3791091f-1f3d-4617-80c0-6a17ac645dc2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472511 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-registry-tls\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472540 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-bound-sa-token\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472561 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3791091f-1f3d-4617-80c0-6a17ac645dc2-registry-certificates\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.472629 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3791091f-1f3d-4617-80c0-6a17ac645dc2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.554984 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.573746 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-bound-sa-token\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.573817 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3791091f-1f3d-4617-80c0-6a17ac645dc2-registry-certificates\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.573881 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3791091f-1f3d-4617-80c0-6a17ac645dc2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.573913 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlv77\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-kube-api-access-qlv77\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 
17:11:20.573932 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3791091f-1f3d-4617-80c0-6a17ac645dc2-trusted-ca\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.573955 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3791091f-1f3d-4617-80c0-6a17ac645dc2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.573978 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-registry-tls\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.588446 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3791091f-1f3d-4617-80c0-6a17ac645dc2-registry-certificates\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.589425 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3791091f-1f3d-4617-80c0-6a17ac645dc2-trusted-ca\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 
17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.590179 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3791091f-1f3d-4617-80c0-6a17ac645dc2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.590764 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3791091f-1f3d-4617-80c0-6a17ac645dc2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.592475 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-registry-tls\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.594238 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-bound-sa-token\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.612425 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlv77\" (UniqueName: \"kubernetes.io/projected/3791091f-1f3d-4617-80c0-6a17ac645dc2-kube-api-access-qlv77\") pod \"image-registry-66df7c8f76-685gs\" (UID: \"3791091f-1f3d-4617-80c0-6a17ac645dc2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.678290 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.723287 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7rq6"] Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.955615 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gnzk5"] Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.957254 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:20 crc kubenswrapper[4585]: I0215 17:11:20.977691 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnzk5"] Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.035831 4585 generic.go:334] "Generic (PLEG): container finished" podID="2324ac96-77b5-46b6-a109-22380ef0475d" containerID="90accddfda7ea397f779364192636962e41ee5afdd8e65c8334c874cde3e1ae7" exitCode=0 Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.035913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flrdq" event={"ID":"2324ac96-77b5-46b6-a109-22380ef0475d","Type":"ContainerDied","Data":"90accddfda7ea397f779364192636962e41ee5afdd8e65c8334c874cde3e1ae7"} Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.049347 4585 generic.go:334] "Generic (PLEG): container finished" podID="ec374614-bada-412d-a649-dc86e3ddaa43" containerID="b868683d43ebd1cf2267c85dd3f4f810659bc3912c18719b1585887f79a093f2" exitCode=0 Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.049761 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9nwr" 
event={"ID":"ec374614-bada-412d-a649-dc86e3ddaa43","Type":"ContainerDied","Data":"b868683d43ebd1cf2267c85dd3f4f810659bc3912c18719b1585887f79a093f2"} Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.055615 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5q42" event={"ID":"817eac9e-3482-4350-bc11-f58ec7bad74c","Type":"ContainerStarted","Data":"1731e29146cd8a59e40514186f7c2abbbf93c824e9912d3032ef346236ea3630"} Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.057545 4585 generic.go:334] "Generic (PLEG): container finished" podID="a8be94f1-f28e-494b-b9c1-d97c75ea5577" containerID="5f6bb448f5006e77100d6b69135d7d44c9d579f5941c6b46fa4f8285b78cbfae" exitCode=0 Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.058646 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7rq6" event={"ID":"a8be94f1-f28e-494b-b9c1-d97c75ea5577","Type":"ContainerDied","Data":"5f6bb448f5006e77100d6b69135d7d44c9d579f5941c6b46fa4f8285b78cbfae"} Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.058684 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7rq6" event={"ID":"a8be94f1-f28e-494b-b9c1-d97c75ea5577","Type":"ContainerStarted","Data":"a3294a2dfeaf55f7bee631218ccd4dbad81bd510c44a7e667fe65207eb7940ad"} Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.081489 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d95df95-a751-4c7e-adeb-a8d6e0db7767-catalog-content\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.081543 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2d95df95-a751-4c7e-adeb-a8d6e0db7767-utilities\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.081648 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m9qt\" (UniqueName: \"kubernetes.io/projected/2d95df95-a751-4c7e-adeb-a8d6e0db7767-kube-api-access-5m9qt\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.140502 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-685gs"] Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.183212 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d95df95-a751-4c7e-adeb-a8d6e0db7767-catalog-content\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.183283 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d95df95-a751-4c7e-adeb-a8d6e0db7767-utilities\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.183402 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m9qt\" (UniqueName: \"kubernetes.io/projected/2d95df95-a751-4c7e-adeb-a8d6e0db7767-kube-api-access-5m9qt\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " 
pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.184835 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d95df95-a751-4c7e-adeb-a8d6e0db7767-catalog-content\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.185575 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d95df95-a751-4c7e-adeb-a8d6e0db7767-utilities\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.203794 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m9qt\" (UniqueName: \"kubernetes.io/projected/2d95df95-a751-4c7e-adeb-a8d6e0db7767-kube-api-access-5m9qt\") pod \"community-operators-gnzk5\" (UID: \"2d95df95-a751-4c7e-adeb-a8d6e0db7767\") " pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.287184 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:21 crc kubenswrapper[4585]: I0215 17:11:21.618748 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnzk5"] Feb 15 17:11:21 crc kubenswrapper[4585]: W0215 17:11:21.642773 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d95df95_a751_4c7e_adeb_a8d6e0db7767.slice/crio-41caf06330c8a4a10598d831303d359baab473bd2525696eaa8bde675bfb6ad8 WatchSource:0}: Error finding container 41caf06330c8a4a10598d831303d359baab473bd2525696eaa8bde675bfb6ad8: Status 404 returned error can't find the container with id 41caf06330c8a4a10598d831303d359baab473bd2525696eaa8bde675bfb6ad8 Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.083530 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7rq6" event={"ID":"a8be94f1-f28e-494b-b9c1-d97c75ea5577","Type":"ContainerStarted","Data":"29fc1d2d00e7c793e77c2baeb295ecebf7550b240823aa7563ad405e45c72141"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.084928 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-685gs" event={"ID":"3791091f-1f3d-4617-80c0-6a17ac645dc2","Type":"ContainerStarted","Data":"ec072cbfd69bde4a2f216aab447e24b014ec6babd14bded5d49dc701097220bd"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.084973 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-685gs" event={"ID":"3791091f-1f3d-4617-80c0-6a17ac645dc2","Type":"ContainerStarted","Data":"4fa346d38a3556837c9d4140fc7cb207c568053d78c072424bb25295ef4759da"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.085433 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:22 crc 
kubenswrapper[4585]: I0215 17:11:22.088242 4585 generic.go:334] "Generic (PLEG): container finished" podID="2d95df95-a751-4c7e-adeb-a8d6e0db7767" containerID="c20c36e7984940e600917730f036eeb39e18a2d0c0aaf4fe9e3e875bcc9989e6" exitCode=0 Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.088292 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnzk5" event={"ID":"2d95df95-a751-4c7e-adeb-a8d6e0db7767","Type":"ContainerDied","Data":"c20c36e7984940e600917730f036eeb39e18a2d0c0aaf4fe9e3e875bcc9989e6"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.088308 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnzk5" event={"ID":"2d95df95-a751-4c7e-adeb-a8d6e0db7767","Type":"ContainerStarted","Data":"41caf06330c8a4a10598d831303d359baab473bd2525696eaa8bde675bfb6ad8"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.090347 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flrdq" event={"ID":"2324ac96-77b5-46b6-a109-22380ef0475d","Type":"ContainerStarted","Data":"c2ff2225376e8cabe3b52a75941e4052f7dc633f54635ba15ce5415f9833539b"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.095455 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9nwr" event={"ID":"ec374614-bada-412d-a649-dc86e3ddaa43","Type":"ContainerStarted","Data":"7b58cfa6f748a862ce7d07d9ff5be13e27cc12e6cc0e4b0c9b92ebc6f2181d75"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.097090 4585 generic.go:334] "Generic (PLEG): container finished" podID="817eac9e-3482-4350-bc11-f58ec7bad74c" containerID="1731e29146cd8a59e40514186f7c2abbbf93c824e9912d3032ef346236ea3630" exitCode=0 Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.097696 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5q42" 
event={"ID":"817eac9e-3482-4350-bc11-f58ec7bad74c","Type":"ContainerDied","Data":"1731e29146cd8a59e40514186f7c2abbbf93c824e9912d3032ef346236ea3630"} Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.161607 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-flrdq" podStartSLOduration=3.68009494 podStartE2EDuration="7.161576541s" podCreationTimestamp="2026-02-15 17:11:15 +0000 UTC" firstStartedPulling="2026-02-15 17:11:17.967898878 +0000 UTC m=+333.911307010" lastFinishedPulling="2026-02-15 17:11:21.449380479 +0000 UTC m=+337.392788611" observedRunningTime="2026-02-15 17:11:22.148226012 +0000 UTC m=+338.091634144" watchObservedRunningTime="2026-02-15 17:11:22.161576541 +0000 UTC m=+338.104984673" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.165127 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rkrdk"] Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.166333 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.196505 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rkrdk"] Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.263165 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-685gs" podStartSLOduration=2.263144162 podStartE2EDuration="2.263144162s" podCreationTimestamp="2026-02-15 17:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:11:22.260280213 +0000 UTC m=+338.203688335" watchObservedRunningTime="2026-02-15 17:11:22.263144162 +0000 UTC m=+338.206552294" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.279475 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9nwr" podStartSLOduration=2.788329407 podStartE2EDuration="5.279456652s" podCreationTimestamp="2026-02-15 17:11:17 +0000 UTC" firstStartedPulling="2026-02-15 17:11:19.002876744 +0000 UTC m=+334.946284876" lastFinishedPulling="2026-02-15 17:11:21.494003989 +0000 UTC m=+337.437412121" observedRunningTime="2026-02-15 17:11:22.278168497 +0000 UTC m=+338.221576629" watchObservedRunningTime="2026-02-15 17:11:22.279456652 +0000 UTC m=+338.222864784" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.293942 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.294229 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.320776 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db06415c-64a8-4c80-8fab-d528d407d10a-catalog-content\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.320816 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6wb\" (UniqueName: \"kubernetes.io/projected/db06415c-64a8-4c80-8fab-d528d407d10a-kube-api-access-7x6wb\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.320848 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06415c-64a8-4c80-8fab-d528d407d10a-utilities\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.341281 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.421981 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06415c-64a8-4c80-8fab-d528d407d10a-utilities\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.422121 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db06415c-64a8-4c80-8fab-d528d407d10a-catalog-content\") pod \"community-operators-rkrdk\" (UID: 
\"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.422150 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6wb\" (UniqueName: \"kubernetes.io/projected/db06415c-64a8-4c80-8fab-d528d407d10a-kube-api-access-7x6wb\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.422993 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db06415c-64a8-4c80-8fab-d528d407d10a-utilities\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.423521 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db06415c-64a8-4c80-8fab-d528d407d10a-catalog-content\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.457011 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6wb\" (UniqueName: \"kubernetes.io/projected/db06415c-64a8-4c80-8fab-d528d407d10a-kube-api-access-7x6wb\") pod \"community-operators-rkrdk\" (UID: \"db06415c-64a8-4c80-8fab-d528d407d10a\") " pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.562689 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:22 crc kubenswrapper[4585]: I0215 17:11:22.829418 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rkrdk"] Feb 15 17:11:22 crc kubenswrapper[4585]: W0215 17:11:22.854298 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb06415c_64a8_4c80_8fab_d528d407d10a.slice/crio-5bb59b1f15fb2fbd430d8e7ccaa5840ca91a257acbdb1cf1d6b583c37dbef157 WatchSource:0}: Error finding container 5bb59b1f15fb2fbd430d8e7ccaa5840ca91a257acbdb1cf1d6b583c37dbef157: Status 404 returned error can't find the container with id 5bb59b1f15fb2fbd430d8e7ccaa5840ca91a257acbdb1cf1d6b583c37dbef157 Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.104958 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnzk5" event={"ID":"2d95df95-a751-4c7e-adeb-a8d6e0db7767","Type":"ContainerStarted","Data":"1cb68f6198e9869095050b112b3b880ed41409d7cc0c4eea5ef8bc8951e18ad4"} Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.107010 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkrdk" event={"ID":"db06415c-64a8-4c80-8fab-d528d407d10a","Type":"ContainerStarted","Data":"94eb1218e3a9e2ce02eb5878fead05cb3bea6b9321e67ba6aef2787fa72a62a6"} Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.107049 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkrdk" event={"ID":"db06415c-64a8-4c80-8fab-d528d407d10a","Type":"ContainerStarted","Data":"5bb59b1f15fb2fbd430d8e7ccaa5840ca91a257acbdb1cf1d6b583c37dbef157"} Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.109532 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5q42" 
event={"ID":"817eac9e-3482-4350-bc11-f58ec7bad74c","Type":"ContainerStarted","Data":"6572c899e055592f44f875f2528fd98bf4c4d5eacffac7be71f4b000e0143e20"} Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.111358 4585 generic.go:334] "Generic (PLEG): container finished" podID="a8be94f1-f28e-494b-b9c1-d97c75ea5577" containerID="29fc1d2d00e7c793e77c2baeb295ecebf7550b240823aa7563ad405e45c72141" exitCode=0 Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.111516 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7rq6" event={"ID":"a8be94f1-f28e-494b-b9c1-d97c75ea5577","Type":"ContainerDied","Data":"29fc1d2d00e7c793e77c2baeb295ecebf7550b240823aa7563ad405e45c72141"} Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.178128 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q5q42" podStartSLOduration=2.704989537 podStartE2EDuration="5.178114197s" podCreationTimestamp="2026-02-15 17:11:18 +0000 UTC" firstStartedPulling="2026-02-15 17:11:20.018048962 +0000 UTC m=+335.961457094" lastFinishedPulling="2026-02-15 17:11:22.491173622 +0000 UTC m=+338.434581754" observedRunningTime="2026-02-15 17:11:23.172197804 +0000 UTC m=+339.115605936" watchObservedRunningTime="2026-02-15 17:11:23.178114197 +0000 UTC m=+339.121522329" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.178341 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p2lqh" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.353315 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bj95v"] Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.357738 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.427938 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj95v"] Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.449362 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-catalog-content\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.449506 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-utilities\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.449831 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzc5g\" (UniqueName: \"kubernetes.io/projected/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-kube-api-access-qzc5g\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.551389 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzc5g\" (UniqueName: \"kubernetes.io/projected/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-kube-api-access-qzc5g\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.551475 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-catalog-content\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.551509 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-utilities\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.552195 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-utilities\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.552921 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-catalog-content\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.581652 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzc5g\" (UniqueName: \"kubernetes.io/projected/550f697e-9cf9-4b02-af9d-c0fe2375bd0c-kube-api-access-qzc5g\") pod \"community-operators-bj95v\" (UID: \"550f697e-9cf9-4b02-af9d-c0fe2375bd0c\") " pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.679757 4585 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.679841 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.704798 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:23 crc kubenswrapper[4585]: I0215 17:11:23.738082 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.010696 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj95v"] Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.118141 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj95v" event={"ID":"550f697e-9cf9-4b02-af9d-c0fe2375bd0c","Type":"ContainerStarted","Data":"7a7bbe612e285ac0626236e6349c3085c35fc8f1baee8403f9adfbb9b6eb7ef3"} Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.121256 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7rq6" event={"ID":"a8be94f1-f28e-494b-b9c1-d97c75ea5577","Type":"ContainerStarted","Data":"b1d0e65d8a5950a843b7ea23a3d6615cc0b99f68508ee91f7b1059f264da71e6"} Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.123366 4585 generic.go:334] "Generic (PLEG): container finished" podID="2d95df95-a751-4c7e-adeb-a8d6e0db7767" containerID="1cb68f6198e9869095050b112b3b880ed41409d7cc0c4eea5ef8bc8951e18ad4" exitCode=0 Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.123419 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnzk5" 
event={"ID":"2d95df95-a751-4c7e-adeb-a8d6e0db7767","Type":"ContainerDied","Data":"1cb68f6198e9869095050b112b3b880ed41409d7cc0c4eea5ef8bc8951e18ad4"} Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.125429 4585 generic.go:334] "Generic (PLEG): container finished" podID="db06415c-64a8-4c80-8fab-d528d407d10a" containerID="94eb1218e3a9e2ce02eb5878fead05cb3bea6b9321e67ba6aef2787fa72a62a6" exitCode=0 Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.125583 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkrdk" event={"ID":"db06415c-64a8-4c80-8fab-d528d407d10a","Type":"ContainerDied","Data":"94eb1218e3a9e2ce02eb5878fead05cb3bea6b9321e67ba6aef2787fa72a62a6"} Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.223278 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fcdtv" Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.250569 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b7rq6" podStartSLOduration=2.780223333 podStartE2EDuration="5.250552015s" podCreationTimestamp="2026-02-15 17:11:19 +0000 UTC" firstStartedPulling="2026-02-15 17:11:21.059582788 +0000 UTC m=+337.002990920" lastFinishedPulling="2026-02-15 17:11:23.52991147 +0000 UTC m=+339.473319602" observedRunningTime="2026-02-15 17:11:24.210397098 +0000 UTC m=+340.153805230" watchObservedRunningTime="2026-02-15 17:11:24.250552015 +0000 UTC m=+340.193960147" Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.549061 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szkmh"] Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.550212 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.616217 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szkmh"]
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.670025 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-utilities\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.670086 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnf4z\" (UniqueName: \"kubernetes.io/projected/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-kube-api-access-xnf4z\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.670168 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-catalog-content\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.771360 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-catalog-content\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.771758 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-utilities\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.771783 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnf4z\" (UniqueName: \"kubernetes.io/projected/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-kube-api-access-xnf4z\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.772767 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-catalog-content\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.772987 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-utilities\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.805504 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnf4z\" (UniqueName: \"kubernetes.io/projected/4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd-kube-api-access-xnf4z\") pod \"community-operators-szkmh\" (UID: \"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd\") " pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.863495 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r4fvv"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.863540 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r4fvv"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.863757 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szkmh"
Feb 15 17:11:24 crc kubenswrapper[4585]: I0215 17:11:24.927869 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r4fvv"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.163881 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnzk5" event={"ID":"2d95df95-a751-4c7e-adeb-a8d6e0db7767","Type":"ContainerStarted","Data":"b32a3c2420c078e38131ef36db0153440698cde4a6cf9eb00913e77513c7d288"}
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.180778 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkrdk" event={"ID":"db06415c-64a8-4c80-8fab-d528d407d10a","Type":"ContainerStarted","Data":"02ef06c13d84a13783f6df2e5ebcd8ac7897f1521f86178bfbc58de452109d12"}
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.191966 4585 generic.go:334] "Generic (PLEG): container finished" podID="550f697e-9cf9-4b02-af9d-c0fe2375bd0c" containerID="9eda7a44aba3db82f2eaab84f24c342a92e98d2c5373eb4676188608d4cc7ccf" exitCode=0
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.192011 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gnzk5" podStartSLOduration=2.739663776 podStartE2EDuration="5.191998311s" podCreationTimestamp="2026-02-15 17:11:20 +0000 UTC" firstStartedPulling="2026-02-15 17:11:22.090965494 +0000 UTC m=+338.034373626" lastFinishedPulling="2026-02-15 17:11:24.543300029 +0000 UTC m=+340.486708161" observedRunningTime="2026-02-15 17:11:25.190830339 +0000 UTC m=+341.134238471" watchObservedRunningTime="2026-02-15 17:11:25.191998311 +0000 UTC m=+341.135406443"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.192027 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj95v" event={"ID":"550f697e-9cf9-4b02-af9d-c0fe2375bd0c","Type":"ContainerDied","Data":"9eda7a44aba3db82f2eaab84f24c342a92e98d2c5373eb4676188608d4cc7ccf"}
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.283730 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r4fvv"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.404715 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szkmh"]
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.750272 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9lz2d"]
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.751356 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.765824 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9lz2d"]
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.803097 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0915232d-c037-4e97-9ad2-a0e1f69b913a-utilities\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.803134 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0915232d-c037-4e97-9ad2-a0e1f69b913a-catalog-content\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.803177 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8cs8\" (UniqueName: \"kubernetes.io/projected/0915232d-c037-4e97-9ad2-a0e1f69b913a-kube-api-access-r8cs8\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.905141 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8cs8\" (UniqueName: \"kubernetes.io/projected/0915232d-c037-4e97-9ad2-a0e1f69b913a-kube-api-access-r8cs8\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.905256 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0915232d-c037-4e97-9ad2-a0e1f69b913a-utilities\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.905274 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0915232d-c037-4e97-9ad2-a0e1f69b913a-catalog-content\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.905686 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0915232d-c037-4e97-9ad2-a0e1f69b913a-catalog-content\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.906137 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0915232d-c037-4e97-9ad2-a0e1f69b913a-utilities\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:25 crc kubenswrapper[4585]: I0215 17:11:25.928715 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8cs8\" (UniqueName: \"kubernetes.io/projected/0915232d-c037-4e97-9ad2-a0e1f69b913a-kube-api-access-r8cs8\") pod \"community-operators-9lz2d\" (UID: \"0915232d-c037-4e97-9ad2-a0e1f69b913a\") " pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.070876 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9lz2d"
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.112709 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-flrdq"
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.113427 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-flrdq"
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.180255 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-flrdq"
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.200653 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj95v" event={"ID":"550f697e-9cf9-4b02-af9d-c0fe2375bd0c","Type":"ContainerStarted","Data":"13eb0aca287d877f4d18b17b94c9b229860c337a2301f324416362c395e90eda"}
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.202439 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szkmh" event={"ID":"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd","Type":"ContainerStarted","Data":"9ef4ae436abe8fe2aa989f87c49d11d5b528753df96e8b97999731ff233562d7"}
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.203551 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szkmh" event={"ID":"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd","Type":"ContainerStarted","Data":"6e7b5204f79ffc21afdbfb9d0d1f1a5b3b81c96d175e1ee25666381b6d8164fc"}
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.283189 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-flrdq"
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.557813 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9lz2d"]
Feb 15 17:11:26 crc kubenswrapper[4585]: W0215 17:11:26.570223 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0915232d_c037_4e97_9ad2_a0e1f69b913a.slice/crio-c629483a3ac3bcd6ae43c2d6ad6f916a7006613b5651c4358c15d52dd07ac038 WatchSource:0}: Error finding container c629483a3ac3bcd6ae43c2d6ad6f916a7006613b5651c4358c15d52dd07ac038: Status 404 returned error can't find the container with id c629483a3ac3bcd6ae43c2d6ad6f916a7006613b5651c4358c15d52dd07ac038
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.955857 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dssfd"]
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.957477 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:26 crc kubenswrapper[4585]: I0215 17:11:26.970591 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dssfd"]
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.026042 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fd9f89-0b8e-4e77-b386-30107cd20578-catalog-content\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.026380 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fd9f89-0b8e-4e77-b386-30107cd20578-utilities\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.026511 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmzq\" (UniqueName: \"kubernetes.io/projected/57fd9f89-0b8e-4e77-b386-30107cd20578-kube-api-access-6hmzq\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.128469 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fd9f89-0b8e-4e77-b386-30107cd20578-utilities\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.128546 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmzq\" (UniqueName: \"kubernetes.io/projected/57fd9f89-0b8e-4e77-b386-30107cd20578-kube-api-access-6hmzq\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.128587 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fd9f89-0b8e-4e77-b386-30107cd20578-catalog-content\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.129791 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57fd9f89-0b8e-4e77-b386-30107cd20578-utilities\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.129922 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57fd9f89-0b8e-4e77-b386-30107cd20578-catalog-content\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.151299 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmzq\" (UniqueName: \"kubernetes.io/projected/57fd9f89-0b8e-4e77-b386-30107cd20578-kube-api-access-6hmzq\") pod \"community-operators-dssfd\" (UID: \"57fd9f89-0b8e-4e77-b386-30107cd20578\") " pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.210305 4585 generic.go:334] "Generic (PLEG): container finished" podID="db06415c-64a8-4c80-8fab-d528d407d10a" containerID="02ef06c13d84a13783f6df2e5ebcd8ac7897f1521f86178bfbc58de452109d12" exitCode=0
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.210390 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkrdk" event={"ID":"db06415c-64a8-4c80-8fab-d528d407d10a","Type":"ContainerDied","Data":"02ef06c13d84a13783f6df2e5ebcd8ac7897f1521f86178bfbc58de452109d12"}
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.214104 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lz2d" event={"ID":"0915232d-c037-4e97-9ad2-a0e1f69b913a","Type":"ContainerStarted","Data":"c629483a3ac3bcd6ae43c2d6ad6f916a7006613b5651c4358c15d52dd07ac038"}
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.216001 4585 generic.go:334] "Generic (PLEG): container finished" podID="4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd" containerID="9ef4ae436abe8fe2aa989f87c49d11d5b528753df96e8b97999731ff233562d7" exitCode=0
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.216050 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szkmh" event={"ID":"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd","Type":"ContainerDied","Data":"9ef4ae436abe8fe2aa989f87c49d11d5b528753df96e8b97999731ff233562d7"}
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.486514 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9nwr"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.486704 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9nwr"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.541064 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9nwr"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.690914 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dssfd"
Feb 15 17:11:27 crc kubenswrapper[4585]: I0215 17:11:27.967401 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dssfd"]
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.148451 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-44gbl"]
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.149512 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.165906 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-44gbl"]
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.223361 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rkrdk" event={"ID":"db06415c-64a8-4c80-8fab-d528d407d10a","Type":"ContainerStarted","Data":"fabd2a16ae3586e0d112225fa2ff25211f36cab1966ff43e5a859d928f6c0e6d"}
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.226039 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lz2d" event={"ID":"0915232d-c037-4e97-9ad2-a0e1f69b913a","Type":"ContainerStarted","Data":"682878cc4e2e575b8173a754fe7e4b2458884f444b4a4d6dd7e83b95307a8222"}
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.229310 4585 generic.go:334] "Generic (PLEG): container finished" podID="550f697e-9cf9-4b02-af9d-c0fe2375bd0c" containerID="13eb0aca287d877f4d18b17b94c9b229860c337a2301f324416362c395e90eda" exitCode=0
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.229379 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj95v" event={"ID":"550f697e-9cf9-4b02-af9d-c0fe2375bd0c","Type":"ContainerDied","Data":"13eb0aca287d877f4d18b17b94c9b229860c337a2301f324416362c395e90eda"}
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.232832 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dssfd" event={"ID":"57fd9f89-0b8e-4e77-b386-30107cd20578","Type":"ContainerStarted","Data":"f28c98896975262c21cdb96a05ef0c3669d0631e82d545b357c006871aed9a9a"}
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.234919 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szkmh" event={"ID":"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd","Type":"ContainerStarted","Data":"b5716c5ad307acccdb8a81c5ae248f915fd7eeb9317b809fe2933a73449dbc99"}
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.270955 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2d833c8-fb60-4668-82c2-fc2fbb186540-utilities\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.271046 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2d833c8-fb60-4668-82c2-fc2fbb186540-catalog-content\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.271119 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q7q5\" (UniqueName: \"kubernetes.io/projected/b2d833c8-fb60-4668-82c2-fc2fbb186540-kube-api-access-8q7q5\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.308901 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rkrdk" podStartSLOduration=2.800822913 podStartE2EDuration="6.308878145s" podCreationTimestamp="2026-02-15 17:11:22 +0000 UTC" firstStartedPulling="2026-02-15 17:11:24.127320347 +0000 UTC m=+340.070728479" lastFinishedPulling="2026-02-15 17:11:27.635375579 +0000 UTC m=+343.578783711" observedRunningTime="2026-02-15 17:11:28.273034877 +0000 UTC m=+344.216443009" watchObservedRunningTime="2026-02-15 17:11:28.308878145 +0000 UTC m=+344.252286277"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.315687 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9nwr"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.384781 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2d833c8-fb60-4668-82c2-fc2fbb186540-utilities\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.389171 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2d833c8-fb60-4668-82c2-fc2fbb186540-catalog-content\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.385581 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2d833c8-fb60-4668-82c2-fc2fbb186540-utilities\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.389624 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q7q5\" (UniqueName: \"kubernetes.io/projected/b2d833c8-fb60-4668-82c2-fc2fbb186540-kube-api-access-8q7q5\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.391279 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2d833c8-fb60-4668-82c2-fc2fbb186540-catalog-content\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.420437 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q7q5\" (UniqueName: \"kubernetes.io/projected/b2d833c8-fb60-4668-82c2-fc2fbb186540-kube-api-access-8q7q5\") pod \"community-operators-44gbl\" (UID: \"b2d833c8-fb60-4668-82c2-fc2fbb186540\") " pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.506173 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-44gbl"
Feb 15 17:11:28 crc kubenswrapper[4585]: I0215 17:11:28.953624 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-44gbl"]
Feb 15 17:11:28 crc kubenswrapper[4585]: W0215 17:11:28.966091 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d833c8_fb60_4668_82c2_fc2fbb186540.slice/crio-c59ec87871eacd5c26153e497d37be925eaa97786d0e3d560d86c9104bddf6ee WatchSource:0}: Error finding container c59ec87871eacd5c26153e497d37be925eaa97786d0e3d560d86c9104bddf6ee: Status 404 returned error can't find the container with id c59ec87871eacd5c26153e497d37be925eaa97786d0e3d560d86c9104bddf6ee
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.170055 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q5q42"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.170657 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q5q42"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.219800 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q5q42"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.244029 4585 generic.go:334] "Generic (PLEG): container finished" podID="0915232d-c037-4e97-9ad2-a0e1f69b913a" containerID="682878cc4e2e575b8173a754fe7e4b2458884f444b4a4d6dd7e83b95307a8222" exitCode=0
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.244092 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lz2d" event={"ID":"0915232d-c037-4e97-9ad2-a0e1f69b913a","Type":"ContainerDied","Data":"682878cc4e2e575b8173a754fe7e4b2458884f444b4a4d6dd7e83b95307a8222"}
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.246381 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44gbl" event={"ID":"b2d833c8-fb60-4668-82c2-fc2fbb186540","Type":"ContainerStarted","Data":"c59ec87871eacd5c26153e497d37be925eaa97786d0e3d560d86c9104bddf6ee"}
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.249829 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj95v" event={"ID":"550f697e-9cf9-4b02-af9d-c0fe2375bd0c","Type":"ContainerStarted","Data":"77077debe060a7ca4263b04566b9b053721130734f87b76b3fae73408cb1d49b"}
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.251696 4585 generic.go:334] "Generic (PLEG): container finished" podID="57fd9f89-0b8e-4e77-b386-30107cd20578" containerID="7e33e5851e22e832a81f1ce9cb0de228d6fef0331c1e7d4947a2b2b275e20e0f" exitCode=0
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.251737 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dssfd" event={"ID":"57fd9f89-0b8e-4e77-b386-30107cd20578","Type":"ContainerDied","Data":"7e33e5851e22e832a81f1ce9cb0de228d6fef0331c1e7d4947a2b2b275e20e0f"}
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.254355 4585 generic.go:334] "Generic (PLEG): container finished" podID="4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd" containerID="b5716c5ad307acccdb8a81c5ae248f915fd7eeb9317b809fe2933a73449dbc99" exitCode=0
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.255211 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szkmh" event={"ID":"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd","Type":"ContainerDied","Data":"b5716c5ad307acccdb8a81c5ae248f915fd7eeb9317b809fe2933a73449dbc99"}
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.302431 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bj95v" podStartSLOduration=2.795008192 podStartE2EDuration="6.302411587s" podCreationTimestamp="2026-02-15 17:11:23 +0000 UTC" firstStartedPulling="2026-02-15 17:11:25.196756102 +0000 UTC m=+341.140164234" lastFinishedPulling="2026-02-15 17:11:28.704159497 +0000 UTC m=+344.647567629" observedRunningTime="2026-02-15 17:11:29.298021886 +0000 UTC m=+345.241430028" watchObservedRunningTime="2026-02-15 17:11:29.302411587 +0000 UTC m=+345.245819719"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.303308 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q5q42"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.359541 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gkd28"]
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.361514 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.377517 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gkd28"]
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.506392 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c258e26a-e1d4-4c57-9808-f3befbd2aab0-catalog-content\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.506457 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c258e26a-e1d4-4c57-9808-f3befbd2aab0-utilities\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.506639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndgzc\" (UniqueName: \"kubernetes.io/projected/c258e26a-e1d4-4c57-9808-f3befbd2aab0-kube-api-access-ndgzc\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.609844 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c258e26a-e1d4-4c57-9808-f3befbd2aab0-catalog-content\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.609916 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c258e26a-e1d4-4c57-9808-f3befbd2aab0-utilities\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.609955 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndgzc\" (UniqueName: \"kubernetes.io/projected/c258e26a-e1d4-4c57-9808-f3befbd2aab0-kube-api-access-ndgzc\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.610542 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c258e26a-e1d4-4c57-9808-f3befbd2aab0-catalog-content\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.610774 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c258e26a-e1d4-4c57-9808-f3befbd2aab0-utilities\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.642106 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndgzc\" (UniqueName: \"kubernetes.io/projected/c258e26a-e1d4-4c57-9808-f3befbd2aab0-kube-api-access-ndgzc\") pod \"community-operators-gkd28\" (UID: \"c258e26a-e1d4-4c57-9808-f3befbd2aab0\") " pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:29 crc kubenswrapper[4585]: I0215 17:11:29.722452 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gkd28"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.074743 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b7rq6"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.116093 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b7rq6"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.132257 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b7rq6"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.260727 4585 generic.go:334] "Generic (PLEG): container finished" podID="b2d833c8-fb60-4668-82c2-fc2fbb186540" containerID="a8754ab73cd42d3e4193dbb9ec7d18399e6334070f6d3b438386b96e687993ea" exitCode=0
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.263250 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44gbl" event={"ID":"b2d833c8-fb60-4668-82c2-fc2fbb186540","Type":"ContainerDied","Data":"a8754ab73cd42d3e4193dbb9ec7d18399e6334070f6d3b438386b96e687993ea"}
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.321506 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b7rq6"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.468987 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gkd28"]
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.555799 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hd7nq"]
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.557281 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.573770 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd7nq"]
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.632294 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c570ec68-3a6e-45fc-b218-7ac142f88dda-utilities\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.632358 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hdz8\" (UniqueName: \"kubernetes.io/projected/c570ec68-3a6e-45fc-b218-7ac142f88dda-kube-api-access-5hdz8\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.632406 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c570ec68-3a6e-45fc-b218-7ac142f88dda-catalog-content\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.733874 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c570ec68-3a6e-45fc-b218-7ac142f88dda-utilities\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.733937 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hdz8\" (UniqueName: \"kubernetes.io/projected/c570ec68-3a6e-45fc-b218-7ac142f88dda-kube-api-access-5hdz8\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.734020 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c570ec68-3a6e-45fc-b218-7ac142f88dda-catalog-content\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.734521 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c570ec68-3a6e-45fc-b218-7ac142f88dda-catalog-content\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.734832 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c570ec68-3a6e-45fc-b218-7ac142f88dda-utilities\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.760803 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hdz8\" (UniqueName: \"kubernetes.io/projected/c570ec68-3a6e-45fc-b218-7ac142f88dda-kube-api-access-5hdz8\") pod \"community-operators-hd7nq\" (UID: \"c570ec68-3a6e-45fc-b218-7ac142f88dda\") " pod="openshift-marketplace/community-operators-hd7nq"
Feb 15 17:11:30 crc kubenswrapper[4585]: I0215 17:11:30.888043 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hd7nq" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.275063 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gkd28" event={"ID":"c258e26a-e1d4-4c57-9808-f3befbd2aab0","Type":"ContainerStarted","Data":"098a997bff6fd3de4784208f84b9250a5507e0c4df6df57bfbc3ad3c279d00d2"} Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.275674 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gkd28" event={"ID":"c258e26a-e1d4-4c57-9808-f3befbd2aab0","Type":"ContainerStarted","Data":"d5413f7078d657eecc6b85030df8d8a83cd6447725d34c65cd3f113bb8b88240"} Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.284123 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lz2d" event={"ID":"0915232d-c037-4e97-9ad2-a0e1f69b913a","Type":"ContainerStarted","Data":"93d313cd307ccc0d87eb9717845242e14db319f88507e7ee787941814a1970d2"} Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.285771 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44gbl" event={"ID":"b2d833c8-fb60-4668-82c2-fc2fbb186540","Type":"ContainerStarted","Data":"f059a79ad7ae1b351a183a04fcc5493eb1c65510ffceab16bdfb7ec4e77bece4"} Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.287273 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.287720 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dssfd" event={"ID":"57fd9f89-0b8e-4e77-b386-30107cd20578","Type":"ContainerStarted","Data":"4ff8b78ad7681b2535385121757921d0f273b736a85b395ddb575fd95b705925"} Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.288078 4585 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.296694 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szkmh" event={"ID":"4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd","Type":"ContainerStarted","Data":"5971f2a9e15de1eacd3ac3f7a1d3cb5fc001df0d95285f1fb347369271bc8336"} Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.403853 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.423269 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szkmh" podStartSLOduration=4.721035792 podStartE2EDuration="7.42324828s" podCreationTimestamp="2026-02-15 17:11:24 +0000 UTC" firstStartedPulling="2026-02-15 17:11:27.217839954 +0000 UTC m=+343.161248096" lastFinishedPulling="2026-02-15 17:11:29.920052452 +0000 UTC m=+345.863460584" observedRunningTime="2026-02-15 17:11:31.422125219 +0000 UTC m=+347.365533351" watchObservedRunningTime="2026-02-15 17:11:31.42324828 +0000 UTC m=+347.366656412" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.486104 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd7nq"] Feb 15 17:11:31 crc kubenswrapper[4585]: W0215 17:11:31.490932 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc570ec68_3a6e_45fc_b218_7ac142f88dda.slice/crio-6a04d63ead2c42c2bac878647f1353a1709a1d1169cb20ea7b70dfd7945b14c8 WatchSource:0}: Error finding container 6a04d63ead2c42c2bac878647f1353a1709a1d1169cb20ea7b70dfd7945b14c8: Status 404 returned error can't find the container with id 6a04d63ead2c42c2bac878647f1353a1709a1d1169cb20ea7b70dfd7945b14c8 Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 
17:11:31.750859 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jgz2x"] Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.752188 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.766744 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jgz2x"] Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.854575 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886dc957-8460-4c2b-99a3-92b9d869a6cd-catalog-content\") pod \"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.854677 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886dc957-8460-4c2b-99a3-92b9d869a6cd-utilities\") pod \"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.854801 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhq8h\" (UniqueName: \"kubernetes.io/projected/886dc957-8460-4c2b-99a3-92b9d869a6cd-kube-api-access-fhq8h\") pod \"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.955665 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886dc957-8460-4c2b-99a3-92b9d869a6cd-catalog-content\") pod 
\"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.955723 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886dc957-8460-4c2b-99a3-92b9d869a6cd-utilities\") pod \"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.955756 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhq8h\" (UniqueName: \"kubernetes.io/projected/886dc957-8460-4c2b-99a3-92b9d869a6cd-kube-api-access-fhq8h\") pod \"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.956331 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/886dc957-8460-4c2b-99a3-92b9d869a6cd-catalog-content\") pod \"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:31 crc kubenswrapper[4585]: I0215 17:11:31.956545 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/886dc957-8460-4c2b-99a3-92b9d869a6cd-utilities\") pod \"community-operators-jgz2x\" (UID: \"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.088573 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhq8h\" (UniqueName: \"kubernetes.io/projected/886dc957-8460-4c2b-99a3-92b9d869a6cd-kube-api-access-fhq8h\") pod \"community-operators-jgz2x\" (UID: 
\"886dc957-8460-4c2b-99a3-92b9d869a6cd\") " pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.114544 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.316678 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd7nq" event={"ID":"c570ec68-3a6e-45fc-b218-7ac142f88dda","Type":"ContainerStarted","Data":"e5be242f934087497f5dc0c4049888980a45a5d9726becce355e85709e08ec3b"} Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.317121 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd7nq" event={"ID":"c570ec68-3a6e-45fc-b218-7ac142f88dda","Type":"ContainerStarted","Data":"6a04d63ead2c42c2bac878647f1353a1709a1d1169cb20ea7b70dfd7945b14c8"} Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.323030 4585 generic.go:334] "Generic (PLEG): container finished" podID="c258e26a-e1d4-4c57-9808-f3befbd2aab0" containerID="098a997bff6fd3de4784208f84b9250a5507e0c4df6df57bfbc3ad3c279d00d2" exitCode=0 Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.323140 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gkd28" event={"ID":"c258e26a-e1d4-4c57-9808-f3befbd2aab0","Type":"ContainerDied","Data":"098a997bff6fd3de4784208f84b9250a5507e0c4df6df57bfbc3ad3c279d00d2"} Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.358642 4585 generic.go:334] "Generic (PLEG): container finished" podID="0915232d-c037-4e97-9ad2-a0e1f69b913a" containerID="93d313cd307ccc0d87eb9717845242e14db319f88507e7ee787941814a1970d2" exitCode=0 Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.358772 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lz2d" 
event={"ID":"0915232d-c037-4e97-9ad2-a0e1f69b913a","Type":"ContainerDied","Data":"93d313cd307ccc0d87eb9717845242e14db319f88507e7ee787941814a1970d2"} Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.377956 4585 generic.go:334] "Generic (PLEG): container finished" podID="57fd9f89-0b8e-4e77-b386-30107cd20578" containerID="4ff8b78ad7681b2535385121757921d0f273b736a85b395ddb575fd95b705925" exitCode=0 Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.380174 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dssfd" event={"ID":"57fd9f89-0b8e-4e77-b386-30107cd20578","Type":"ContainerDied","Data":"4ff8b78ad7681b2535385121757921d0f273b736a85b395ddb575fd95b705925"} Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.443119 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gnzk5" Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.557889 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jgz2x"] Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.563817 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.564820 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:32 crc kubenswrapper[4585]: W0215 17:11:32.569400 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886dc957_8460_4c2b_99a3_92b9d869a6cd.slice/crio-1d472d24b2b8a2f19d1e146799244797ed0f832ffa782e1199e9a2952f826f32 WatchSource:0}: Error finding container 1d472d24b2b8a2f19d1e146799244797ed0f832ffa782e1199e9a2952f826f32: Status 404 returned error can't find the container with id 
1d472d24b2b8a2f19d1e146799244797ed0f832ffa782e1199e9a2952f826f32 Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.626254 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.959187 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cgcn2"] Feb 15 17:11:32 crc kubenswrapper[4585]: I0215 17:11:32.960869 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.000254 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgcn2"] Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.092623 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/191919ef-1328-4e66-9194-d59f017c027f-utilities\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.092838 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/191919ef-1328-4e66-9194-d59f017c027f-catalog-content\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.092898 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkfmc\" (UniqueName: \"kubernetes.io/projected/191919ef-1328-4e66-9194-d59f017c027f-kube-api-access-lkfmc\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " 
pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.194171 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/191919ef-1328-4e66-9194-d59f017c027f-catalog-content\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.194244 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkfmc\" (UniqueName: \"kubernetes.io/projected/191919ef-1328-4e66-9194-d59f017c027f-kube-api-access-lkfmc\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.194287 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/191919ef-1328-4e66-9194-d59f017c027f-utilities\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.194903 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/191919ef-1328-4e66-9194-d59f017c027f-utilities\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.195217 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/191919ef-1328-4e66-9194-d59f017c027f-catalog-content\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " 
pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.217488 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkfmc\" (UniqueName: \"kubernetes.io/projected/191919ef-1328-4e66-9194-d59f017c027f-kube-api-access-lkfmc\") pod \"community-operators-cgcn2\" (UID: \"191919ef-1328-4e66-9194-d59f017c027f\") " pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.332774 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.388156 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9lz2d" event={"ID":"0915232d-c037-4e97-9ad2-a0e1f69b913a","Type":"ContainerStarted","Data":"5f58c2dc7159c8833a42c670f3c7380a0b9b386a2f952c434e09291c08508d8e"} Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.398789 4585 generic.go:334] "Generic (PLEG): container finished" podID="b2d833c8-fb60-4668-82c2-fc2fbb186540" containerID="f059a79ad7ae1b351a183a04fcc5493eb1c65510ffceab16bdfb7ec4e77bece4" exitCode=0 Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.398868 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44gbl" event={"ID":"b2d833c8-fb60-4668-82c2-fc2fbb186540","Type":"ContainerDied","Data":"f059a79ad7ae1b351a183a04fcc5493eb1c65510ffceab16bdfb7ec4e77bece4"} Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.415028 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9lz2d" podStartSLOduration=4.763351389 podStartE2EDuration="8.415004213s" podCreationTimestamp="2026-02-15 17:11:25 +0000 UTC" firstStartedPulling="2026-02-15 17:11:29.24669897 +0000 UTC m=+345.190107102" lastFinishedPulling="2026-02-15 17:11:32.898351794 +0000 UTC 
m=+348.841759926" observedRunningTime="2026-02-15 17:11:33.414795437 +0000 UTC m=+349.358203569" watchObservedRunningTime="2026-02-15 17:11:33.415004213 +0000 UTC m=+349.358412345" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.416492 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dssfd" event={"ID":"57fd9f89-0b8e-4e77-b386-30107cd20578","Type":"ContainerStarted","Data":"e17a0678d791b7d3908440eaf1c6e51a7555a48abeab3515b77dcf27e9b84844"} Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.425020 4585 generic.go:334] "Generic (PLEG): container finished" podID="c570ec68-3a6e-45fc-b218-7ac142f88dda" containerID="e5be242f934087497f5dc0c4049888980a45a5d9726becce355e85709e08ec3b" exitCode=0 Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.425116 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd7nq" event={"ID":"c570ec68-3a6e-45fc-b218-7ac142f88dda","Type":"ContainerDied","Data":"e5be242f934087497f5dc0c4049888980a45a5d9726becce355e85709e08ec3b"} Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.430965 4585 generic.go:334] "Generic (PLEG): container finished" podID="886dc957-8460-4c2b-99a3-92b9d869a6cd" containerID="0427a229a795ff9ca96635adbaa9ab28eba71cc43f8d7339fb26b584b993e41f" exitCode=0 Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.431127 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgz2x" event={"ID":"886dc957-8460-4c2b-99a3-92b9d869a6cd","Type":"ContainerDied","Data":"0427a229a795ff9ca96635adbaa9ab28eba71cc43f8d7339fb26b584b993e41f"} Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.431188 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgz2x" event={"ID":"886dc957-8460-4c2b-99a3-92b9d869a6cd","Type":"ContainerStarted","Data":"1d472d24b2b8a2f19d1e146799244797ed0f832ffa782e1199e9a2952f826f32"} Feb 15 17:11:33 
crc kubenswrapper[4585]: I0215 17:11:33.437188 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gkd28" event={"ID":"c258e26a-e1d4-4c57-9808-f3befbd2aab0","Type":"ContainerStarted","Data":"f41aefc0d57795fcfc6be138eabcffbbcec48cc4360237c6d15f2e448a124b4f"} Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.482437 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dssfd" podStartSLOduration=3.951277563 podStartE2EDuration="7.482410062s" podCreationTimestamp="2026-02-15 17:11:26 +0000 UTC" firstStartedPulling="2026-02-15 17:11:29.252860391 +0000 UTC m=+345.196268523" lastFinishedPulling="2026-02-15 17:11:32.78399289 +0000 UTC m=+348.727401022" observedRunningTime="2026-02-15 17:11:33.462556054 +0000 UTC m=+349.405964186" watchObservedRunningTime="2026-02-15 17:11:33.482410062 +0000 UTC m=+349.425818194" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.564509 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rkrdk" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.708081 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.708998 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.787568 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:33 crc kubenswrapper[4585]: I0215 17:11:33.897656 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgcn2"] Feb 15 17:11:33 crc kubenswrapper[4585]: W0215 17:11:33.902122 4585 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod191919ef_1328_4e66_9194_d59f017c027f.slice/crio-20cd445f74e1043f425ae41a1092c8ece15bfb2bc22c7fa0e58164496d6899b8 WatchSource:0}: Error finding container 20cd445f74e1043f425ae41a1092c8ece15bfb2bc22c7fa0e58164496d6899b8: Status 404 returned error can't find the container with id 20cd445f74e1043f425ae41a1092c8ece15bfb2bc22c7fa0e58164496d6899b8 Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.157364 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hvhv"] Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.159556 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.187027 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hvhv"] Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.313374 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123edd1b-642f-43ad-a1ca-632a99f27945-utilities\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.313491 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tkr\" (UniqueName: \"kubernetes.io/projected/123edd1b-642f-43ad-a1ca-632a99f27945-kube-api-access-k5tkr\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.313528 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/123edd1b-642f-43ad-a1ca-632a99f27945-catalog-content\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.414510 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123edd1b-642f-43ad-a1ca-632a99f27945-catalog-content\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.414592 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/123edd1b-642f-43ad-a1ca-632a99f27945-utilities\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.414674 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tkr\" (UniqueName: \"kubernetes.io/projected/123edd1b-642f-43ad-a1ca-632a99f27945-kube-api-access-k5tkr\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.415211 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/123edd1b-642f-43ad-a1ca-632a99f27945-catalog-content\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.415521 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/123edd1b-642f-43ad-a1ca-632a99f27945-utilities\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.439991 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5tkr\" (UniqueName: \"kubernetes.io/projected/123edd1b-642f-43ad-a1ca-632a99f27945-kube-api-access-k5tkr\") pod \"community-operators-7hvhv\" (UID: \"123edd1b-642f-43ad-a1ca-632a99f27945\") " pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.444393 4585 generic.go:334] "Generic (PLEG): container finished" podID="c258e26a-e1d4-4c57-9808-f3befbd2aab0" containerID="f41aefc0d57795fcfc6be138eabcffbbcec48cc4360237c6d15f2e448a124b4f" exitCode=0 Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.444456 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gkd28" event={"ID":"c258e26a-e1d4-4c57-9808-f3befbd2aab0","Type":"ContainerDied","Data":"f41aefc0d57795fcfc6be138eabcffbbcec48cc4360237c6d15f2e448a124b4f"} Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.446002 4585 generic.go:334] "Generic (PLEG): container finished" podID="191919ef-1328-4e66-9194-d59f017c027f" containerID="7f2a990e361cd71289262bca59dbd1b43056e18b4df339f44e62877f37ee7b0b" exitCode=0 Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.446329 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgcn2" event={"ID":"191919ef-1328-4e66-9194-d59f017c027f","Type":"ContainerDied","Data":"7f2a990e361cd71289262bca59dbd1b43056e18b4df339f44e62877f37ee7b0b"} Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.446358 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgcn2" 
event={"ID":"191919ef-1328-4e66-9194-d59f017c027f","Type":"ContainerStarted","Data":"20cd445f74e1043f425ae41a1092c8ece15bfb2bc22c7fa0e58164496d6899b8"} Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.448579 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-44gbl" event={"ID":"b2d833c8-fb60-4668-82c2-fc2fbb186540","Type":"ContainerStarted","Data":"4184d0c24fd451c24fd6ecf4f9650eee04b98e595a71a304cadc2c25f40a42fc"} Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.452836 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd7nq" event={"ID":"c570ec68-3a6e-45fc-b218-7ac142f88dda","Type":"ContainerStarted","Data":"f3f830d3b95f59da89827a86aa8ae170f0ea0e09cf92575f2ef7d0debb6076bc"} Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.522427 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bj95v" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.541053 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.552103 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-44gbl" podStartSLOduration=2.7191648219999998 podStartE2EDuration="6.552085454s" podCreationTimestamp="2026-02-15 17:11:28 +0000 UTC" firstStartedPulling="2026-02-15 17:11:30.271070413 +0000 UTC m=+346.214478545" lastFinishedPulling="2026-02-15 17:11:34.103991045 +0000 UTC m=+350.047399177" observedRunningTime="2026-02-15 17:11:34.54724484 +0000 UTC m=+350.490652962" watchObservedRunningTime="2026-02-15 17:11:34.552085454 +0000 UTC m=+350.495493576" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.865887 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-szkmh" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.866202 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szkmh" Feb 15 17:11:34 crc kubenswrapper[4585]: I0215 17:11:34.943302 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-szkmh" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.093094 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hvhv"] Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.361791 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xs7w7"] Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.363002 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.376362 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs7w7"] Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.439252 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4x8\" (UniqueName: \"kubernetes.io/projected/90aac239-5f41-437c-8bc1-a9164637e556-kube-api-access-kn4x8\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.440297 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90aac239-5f41-437c-8bc1-a9164637e556-catalog-content\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.440392 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90aac239-5f41-437c-8bc1-a9164637e556-utilities\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.459567 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gkd28" event={"ID":"c258e26a-e1d4-4c57-9808-f3befbd2aab0","Type":"ContainerStarted","Data":"72c169e081af154dc4ee5cc5d2cccfb8133458552b11d205c5e88b041100f895"} Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.461089 4585 generic.go:334] "Generic (PLEG): container finished" podID="123edd1b-642f-43ad-a1ca-632a99f27945" 
containerID="80464d65ff3a0d270b39042ffb6229760fce8020b4eb1c8628219a8dce754a10" exitCode=0 Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.461164 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hvhv" event={"ID":"123edd1b-642f-43ad-a1ca-632a99f27945","Type":"ContainerDied","Data":"80464d65ff3a0d270b39042ffb6229760fce8020b4eb1c8628219a8dce754a10"} Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.461192 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hvhv" event={"ID":"123edd1b-642f-43ad-a1ca-632a99f27945","Type":"ContainerStarted","Data":"b3f32c27a83f703fbc3cf927d7f755fe8b48a64350b279cc95b638276b17e7a7"} Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.463161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgcn2" event={"ID":"191919ef-1328-4e66-9194-d59f017c027f","Type":"ContainerStarted","Data":"5b03ac7ba593998b6971d8e9c28cd0d19476ae9ac3feb1669f57693e4aeaeb2e"} Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.466697 4585 generic.go:334] "Generic (PLEG): container finished" podID="c570ec68-3a6e-45fc-b218-7ac142f88dda" containerID="f3f830d3b95f59da89827a86aa8ae170f0ea0e09cf92575f2ef7d0debb6076bc" exitCode=0 Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.466821 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd7nq" event={"ID":"c570ec68-3a6e-45fc-b218-7ac142f88dda","Type":"ContainerDied","Data":"f3f830d3b95f59da89827a86aa8ae170f0ea0e09cf92575f2ef7d0debb6076bc"} Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.474717 4585 generic.go:334] "Generic (PLEG): container finished" podID="886dc957-8460-4c2b-99a3-92b9d869a6cd" containerID="ef3a2f2d117bb375378ff5a23538a91b8a4ecb35130e0c8e16405dfba81e2f31" exitCode=0 Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.474946 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jgz2x" event={"ID":"886dc957-8460-4c2b-99a3-92b9d869a6cd","Type":"ContainerDied","Data":"ef3a2f2d117bb375378ff5a23538a91b8a4ecb35130e0c8e16405dfba81e2f31"} Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.483707 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gkd28" podStartSLOduration=3.717620669 podStartE2EDuration="6.483691118s" podCreationTimestamp="2026-02-15 17:11:29 +0000 UTC" firstStartedPulling="2026-02-15 17:11:32.35725131 +0000 UTC m=+348.300659442" lastFinishedPulling="2026-02-15 17:11:35.123321759 +0000 UTC m=+351.066729891" observedRunningTime="2026-02-15 17:11:35.480113999 +0000 UTC m=+351.423522141" watchObservedRunningTime="2026-02-15 17:11:35.483691118 +0000 UTC m=+351.427099240" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.530182 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szkmh" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.541692 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4x8\" (UniqueName: \"kubernetes.io/projected/90aac239-5f41-437c-8bc1-a9164637e556-kube-api-access-kn4x8\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.541754 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90aac239-5f41-437c-8bc1-a9164637e556-catalog-content\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.541808 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90aac239-5f41-437c-8bc1-a9164637e556-utilities\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.542758 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90aac239-5f41-437c-8bc1-a9164637e556-utilities\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.543128 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90aac239-5f41-437c-8bc1-a9164637e556-catalog-content\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.570520 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4x8\" (UniqueName: \"kubernetes.io/projected/90aac239-5f41-437c-8bc1-a9164637e556-kube-api-access-kn4x8\") pod \"community-operators-xs7w7\" (UID: \"90aac239-5f41-437c-8bc1-a9164637e556\") " pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:35 crc kubenswrapper[4585]: I0215 17:11:35.722417 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.071377 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9lz2d" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.071739 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9lz2d" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.137020 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9lz2d" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.185399 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xs7w7"] Feb 15 17:11:36 crc kubenswrapper[4585]: W0215 17:11:36.209211 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90aac239_5f41_437c_8bc1_a9164637e556.slice/crio-637ae96acd33ac9522a7fad2e5189a2aa26c031337cd1ae5fab82dbbda583275 WatchSource:0}: Error finding container 637ae96acd33ac9522a7fad2e5189a2aa26c031337cd1ae5fab82dbbda583275: Status 404 returned error can't find the container with id 637ae96acd33ac9522a7fad2e5189a2aa26c031337cd1ae5fab82dbbda583275 Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.481586 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hvhv" event={"ID":"123edd1b-642f-43ad-a1ca-632a99f27945","Type":"ContainerStarted","Data":"f27795bbea8c365f723fd9c96e671bff1543bb4fe1cb8e52fccd15de4307c52c"} Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.483715 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs7w7" 
event={"ID":"90aac239-5f41-437c-8bc1-a9164637e556","Type":"ContainerStarted","Data":"ef523aef65af9f953a52571e4d69451abfcec63909fa0eaecc743bc0b3118b65"} Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.483744 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs7w7" event={"ID":"90aac239-5f41-437c-8bc1-a9164637e556","Type":"ContainerStarted","Data":"637ae96acd33ac9522a7fad2e5189a2aa26c031337cd1ae5fab82dbbda583275"} Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.485619 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd7nq" event={"ID":"c570ec68-3a6e-45fc-b218-7ac142f88dda","Type":"ContainerStarted","Data":"c9550d475873204b4c99ecd78f10e4e9e5264fd52d5b6c28fc8d12e4bd4323e2"} Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.488114 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jgz2x" event={"ID":"886dc957-8460-4c2b-99a3-92b9d869a6cd","Type":"ContainerStarted","Data":"23da87aafcb2b92d15e083b444ead96ceddcb389d42e098e1a3f9aa2da459e2f"} Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.529522 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hd7nq" podStartSLOduration=3.8949548 podStartE2EDuration="6.529507681s" podCreationTimestamp="2026-02-15 17:11:30 +0000 UTC" firstStartedPulling="2026-02-15 17:11:33.427318863 +0000 UTC m=+349.370726995" lastFinishedPulling="2026-02-15 17:11:36.061871744 +0000 UTC m=+352.005279876" observedRunningTime="2026-02-15 17:11:36.522979051 +0000 UTC m=+352.466387183" watchObservedRunningTime="2026-02-15 17:11:36.529507681 +0000 UTC m=+352.472915813" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.575350 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tvwl7"] Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.576540 4585 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.595750 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvwl7"] Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.595722 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jgz2x" podStartSLOduration=2.988407347 podStartE2EDuration="5.595691137s" podCreationTimestamp="2026-02-15 17:11:31 +0000 UTC" firstStartedPulling="2026-02-15 17:11:33.433643217 +0000 UTC m=+349.377051349" lastFinishedPulling="2026-02-15 17:11:36.040927007 +0000 UTC m=+351.984335139" observedRunningTime="2026-02-15 17:11:36.588117107 +0000 UTC m=+352.531525239" watchObservedRunningTime="2026-02-15 17:11:36.595691137 +0000 UTC m=+352.539099269" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.661054 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b92116ae-2b01-4062-ba27-ce0ad95d9b83-catalog-content\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.661137 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b92116ae-2b01-4062-ba27-ce0ad95d9b83-utilities\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.661187 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxwp2\" (UniqueName: 
\"kubernetes.io/projected/b92116ae-2b01-4062-ba27-ce0ad95d9b83-kube-api-access-kxwp2\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.762867 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b92116ae-2b01-4062-ba27-ce0ad95d9b83-catalog-content\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.763223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b92116ae-2b01-4062-ba27-ce0ad95d9b83-utilities\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.763332 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxwp2\" (UniqueName: \"kubernetes.io/projected/b92116ae-2b01-4062-ba27-ce0ad95d9b83-kube-api-access-kxwp2\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.763455 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b92116ae-2b01-4062-ba27-ce0ad95d9b83-catalog-content\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.763570 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b92116ae-2b01-4062-ba27-ce0ad95d9b83-utilities\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.783326 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxwp2\" (UniqueName: \"kubernetes.io/projected/b92116ae-2b01-4062-ba27-ce0ad95d9b83-kube-api-access-kxwp2\") pod \"community-operators-tvwl7\" (UID: \"b92116ae-2b01-4062-ba27-ce0ad95d9b83\") " pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:36 crc kubenswrapper[4585]: I0215 17:11:36.939194 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.437702 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvwl7"] Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.499301 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvwl7" event={"ID":"b92116ae-2b01-4062-ba27-ce0ad95d9b83","Type":"ContainerStarted","Data":"0d5aadff8fa5533a8be6e812a4c704d71c7660aa4f20792bc453265a3ecd35a1"} Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.504936 4585 generic.go:334] "Generic (PLEG): container finished" podID="191919ef-1328-4e66-9194-d59f017c027f" containerID="5b03ac7ba593998b6971d8e9c28cd0d19476ae9ac3feb1669f57693e4aeaeb2e" exitCode=0 Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.505107 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgcn2" event={"ID":"191919ef-1328-4e66-9194-d59f017c027f","Type":"ContainerDied","Data":"5b03ac7ba593998b6971d8e9c28cd0d19476ae9ac3feb1669f57693e4aeaeb2e"} Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.691446 4585 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dssfd" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.691496 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dssfd" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.751698 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8c7xf"] Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.752971 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.764303 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8c7xf"] Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.769896 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dssfd" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.780226 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5v7\" (UniqueName: \"kubernetes.io/projected/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-kube-api-access-rv5v7\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.780272 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-utilities\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.780307 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-catalog-content\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.881445 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-catalog-content\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.881552 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5v7\" (UniqueName: \"kubernetes.io/projected/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-kube-api-access-rv5v7\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.881574 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-utilities\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.881972 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-utilities\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.882436 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-catalog-content\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:37 crc kubenswrapper[4585]: I0215 17:11:37.905569 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5v7\" (UniqueName: \"kubernetes.io/projected/e1ff32f1-5c8f-4062-b6f6-5fb81667a532-kube-api-access-rv5v7\") pod \"community-operators-8c7xf\" (UID: \"e1ff32f1-5c8f-4062-b6f6-5fb81667a532\") " pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.068244 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.506365 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-44gbl" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.506645 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-44gbl" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.512839 4585 generic.go:334] "Generic (PLEG): container finished" podID="b92116ae-2b01-4062-ba27-ce0ad95d9b83" containerID="ab689882f341dfb4b93f2682e052cd8dd8cf5e67402cc7a5985fbadf41f08c5b" exitCode=0 Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.512902 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvwl7" event={"ID":"b92116ae-2b01-4062-ba27-ce0ad95d9b83","Type":"ContainerDied","Data":"ab689882f341dfb4b93f2682e052cd8dd8cf5e67402cc7a5985fbadf41f08c5b"} Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.514942 4585 generic.go:334] "Generic (PLEG): container finished" podID="123edd1b-642f-43ad-a1ca-632a99f27945" 
containerID="f27795bbea8c365f723fd9c96e671bff1543bb4fe1cb8e52fccd15de4307c52c" exitCode=0 Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.514985 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hvhv" event={"ID":"123edd1b-642f-43ad-a1ca-632a99f27945","Type":"ContainerDied","Data":"f27795bbea8c365f723fd9c96e671bff1543bb4fe1cb8e52fccd15de4307c52c"} Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.516586 4585 generic.go:334] "Generic (PLEG): container finished" podID="90aac239-5f41-437c-8bc1-a9164637e556" containerID="ef523aef65af9f953a52571e4d69451abfcec63909fa0eaecc743bc0b3118b65" exitCode=0 Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.517117 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs7w7" event={"ID":"90aac239-5f41-437c-8bc1-a9164637e556","Type":"ContainerDied","Data":"ef523aef65af9f953a52571e4d69451abfcec63909fa0eaecc743bc0b3118b65"} Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.583541 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dssfd" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.627695 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8c7xf"] Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.652272 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-44gbl" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.967009 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nxp97"] Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.971399 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.977233 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxp97"] Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.995573 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5cad97d-c68c-4859-bc2b-746461b8b361-utilities\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.995653 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5cad97d-c68c-4859-bc2b-746461b8b361-catalog-content\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:38 crc kubenswrapper[4585]: I0215 17:11:38.995721 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqxbg\" (UniqueName: \"kubernetes.io/projected/f5cad97d-c68c-4859-bc2b-746461b8b361-kube-api-access-bqxbg\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: E0215 17:11:39.065653 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1ff32f1_5c8f_4062_b6f6_5fb81667a532.slice/crio-conmon-fd0e92259054818a395a1b6feabad5234583308fb240b42e610f827a80e56001.scope\": RecentStats: unable to find data in memory cache]" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.096464 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5cad97d-c68c-4859-bc2b-746461b8b361-catalog-content\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.096552 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqxbg\" (UniqueName: \"kubernetes.io/projected/f5cad97d-c68c-4859-bc2b-746461b8b361-kube-api-access-bqxbg\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.096610 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5cad97d-c68c-4859-bc2b-746461b8b361-utilities\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.097021 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5cad97d-c68c-4859-bc2b-746461b8b361-utilities\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.097187 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5cad97d-c68c-4859-bc2b-746461b8b361-catalog-content\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.134590 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bqxbg\" (UniqueName: \"kubernetes.io/projected/f5cad97d-c68c-4859-bc2b-746461b8b361-kube-api-access-bqxbg\") pod \"community-operators-nxp97\" (UID: \"f5cad97d-c68c-4859-bc2b-746461b8b361\") " pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.352023 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.535486 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hvhv" event={"ID":"123edd1b-642f-43ad-a1ca-632a99f27945","Type":"ContainerStarted","Data":"df08eeeff9d5b4642dcd1a171c5f96b8218e96efd2dd152a32a3d1c9017b0328"} Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.542076 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgcn2" event={"ID":"191919ef-1328-4e66-9194-d59f017c027f","Type":"ContainerStarted","Data":"564b97ec40a302a6b7fc78e372bf9fa8c25d47fdda5ae429e6d8138b415085ca"} Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.563354 4585 generic.go:334] "Generic (PLEG): container finished" podID="e1ff32f1-5c8f-4062-b6f6-5fb81667a532" containerID="fd0e92259054818a395a1b6feabad5234583308fb240b42e610f827a80e56001" exitCode=0 Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.563440 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7xf" event={"ID":"e1ff32f1-5c8f-4062-b6f6-5fb81667a532","Type":"ContainerDied","Data":"fd0e92259054818a395a1b6feabad5234583308fb240b42e610f827a80e56001"} Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.563466 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7xf" 
event={"ID":"e1ff32f1-5c8f-4062-b6f6-5fb81667a532","Type":"ContainerStarted","Data":"c74069685ff2587915deb66229df06d7a120cdf27051fea6cd7a13e1c8511b48"} Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.566839 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hvhv" podStartSLOduration=2.109651512 podStartE2EDuration="5.566828722s" podCreationTimestamp="2026-02-15 17:11:34 +0000 UTC" firstStartedPulling="2026-02-15 17:11:35.462996607 +0000 UTC m=+351.406404739" lastFinishedPulling="2026-02-15 17:11:38.920173827 +0000 UTC m=+354.863581949" observedRunningTime="2026-02-15 17:11:39.557886175 +0000 UTC m=+355.501294327" watchObservedRunningTime="2026-02-15 17:11:39.566828722 +0000 UTC m=+355.510236854" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.589662 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cgcn2" podStartSLOduration=3.560210998 podStartE2EDuration="7.589640361s" podCreationTimestamp="2026-02-15 17:11:32 +0000 UTC" firstStartedPulling="2026-02-15 17:11:34.452651552 +0000 UTC m=+350.396059684" lastFinishedPulling="2026-02-15 17:11:38.482080925 +0000 UTC m=+354.425489047" observedRunningTime="2026-02-15 17:11:39.583332327 +0000 UTC m=+355.526740459" watchObservedRunningTime="2026-02-15 17:11:39.589640361 +0000 UTC m=+355.533048503" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.679271 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-44gbl" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.723262 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gkd28" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.723309 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gkd28" Feb 15 17:11:39 
crc kubenswrapper[4585]: I0215 17:11:39.823010 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gkd28" Feb 15 17:11:39 crc kubenswrapper[4585]: I0215 17:11:39.872716 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nxp97"] Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.158051 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85w5q"] Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.159256 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.167222 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85w5q"] Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.215439 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b628bbea-ba4e-447d-a27e-22a56cea0bfc-catalog-content\") pod \"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.215505 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnx6\" (UniqueName: \"kubernetes.io/projected/b628bbea-ba4e-447d-a27e-22a56cea0bfc-kube-api-access-zcnx6\") pod \"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.215542 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b628bbea-ba4e-447d-a27e-22a56cea0bfc-utilities\") pod 
\"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.317082 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b628bbea-ba4e-447d-a27e-22a56cea0bfc-catalog-content\") pod \"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.317493 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnx6\" (UniqueName: \"kubernetes.io/projected/b628bbea-ba4e-447d-a27e-22a56cea0bfc-kube-api-access-zcnx6\") pod \"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.317530 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b628bbea-ba4e-447d-a27e-22a56cea0bfc-utilities\") pod \"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.317786 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b628bbea-ba4e-447d-a27e-22a56cea0bfc-catalog-content\") pod \"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.318054 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b628bbea-ba4e-447d-a27e-22a56cea0bfc-utilities\") pod \"community-operators-85w5q\" (UID: 
\"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.346790 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnx6\" (UniqueName: \"kubernetes.io/projected/b628bbea-ba4e-447d-a27e-22a56cea0bfc-kube-api-access-zcnx6\") pod \"community-operators-85w5q\" (UID: \"b628bbea-ba4e-447d-a27e-22a56cea0bfc\") " pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.488804 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.625628 4585 generic.go:334] "Generic (PLEG): container finished" podID="90aac239-5f41-437c-8bc1-a9164637e556" containerID="71131a34fbb1f727d4e01b189724b601d4c305ec234a4e42196d8ec14c7a9854" exitCode=0 Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.625958 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs7w7" event={"ID":"90aac239-5f41-437c-8bc1-a9164637e556","Type":"ContainerDied","Data":"71131a34fbb1f727d4e01b189724b601d4c305ec234a4e42196d8ec14c7a9854"} Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.633110 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7xf" event={"ID":"e1ff32f1-5c8f-4062-b6f6-5fb81667a532","Type":"ContainerStarted","Data":"665fe3afacd0fbcc8710ab731e88adcad17861446ffd46d40da4a40ad8b95c1e"} Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.649187 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxp97" event={"ID":"f5cad97d-c68c-4859-bc2b-746461b8b361","Type":"ContainerStarted","Data":"c1624caeb134aca7e01c6b8b85ea40bda6c4cd659979fee5f01afa3f2bbc8b23"} Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.649231 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxp97" event={"ID":"f5cad97d-c68c-4859-bc2b-746461b8b361","Type":"ContainerStarted","Data":"ae087c8f135ca4ec6b0180e4cfc2dadec58afa440ec6e6d34423c499b2411e1c"} Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.655309 4585 generic.go:334] "Generic (PLEG): container finished" podID="b92116ae-2b01-4062-ba27-ce0ad95d9b83" containerID="fd4be813cdf02d4ac3065216bcc1c6b20729b848c174b224f0a5091c10e22cf8" exitCode=0 Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.656242 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvwl7" event={"ID":"b92116ae-2b01-4062-ba27-ce0ad95d9b83","Type":"ContainerDied","Data":"fd4be813cdf02d4ac3065216bcc1c6b20729b848c174b224f0a5091c10e22cf8"} Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.690940 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-685gs" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.744952 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gkd28" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.758120 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-st5w4"] Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.888428 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hd7nq" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.888479 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hd7nq" Feb 15 17:11:40 crc kubenswrapper[4585]: I0215 17:11:40.935107 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hd7nq" Feb 15 17:11:40 crc 
kubenswrapper[4585]: I0215 17:11:40.944778 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85w5q"] Feb 15 17:11:40 crc kubenswrapper[4585]: W0215 17:11:40.960531 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb628bbea_ba4e_447d_a27e_22a56cea0bfc.slice/crio-05f4818c1a3b9f2bafe758f02fcba5eed625304a8e22bba6d06ad3e6f7faf79b WatchSource:0}: Error finding container 05f4818c1a3b9f2bafe758f02fcba5eed625304a8e22bba6d06ad3e6f7faf79b: Status 404 returned error can't find the container with id 05f4818c1a3b9f2bafe758f02fcba5eed625304a8e22bba6d06ad3e6f7faf79b Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.360670 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5fsfv"] Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.362325 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.368330 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5fsfv"] Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.439340 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqrk\" (UniqueName: \"kubernetes.io/projected/ac617faa-373e-4ae0-8fe0-bbecee411a35-kube-api-access-4wqrk\") pod \"community-operators-5fsfv\" (UID: \"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.439432 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac617faa-373e-4ae0-8fe0-bbecee411a35-utilities\") pod \"community-operators-5fsfv\" (UID: 
\"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.439455 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac617faa-373e-4ae0-8fe0-bbecee411a35-catalog-content\") pod \"community-operators-5fsfv\" (UID: \"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.540510 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac617faa-373e-4ae0-8fe0-bbecee411a35-utilities\") pod \"community-operators-5fsfv\" (UID: \"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.540553 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac617faa-373e-4ae0-8fe0-bbecee411a35-catalog-content\") pod \"community-operators-5fsfv\" (UID: \"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.540632 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqrk\" (UniqueName: \"kubernetes.io/projected/ac617faa-373e-4ae0-8fe0-bbecee411a35-kube-api-access-4wqrk\") pod \"community-operators-5fsfv\" (UID: \"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.541293 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac617faa-373e-4ae0-8fe0-bbecee411a35-catalog-content\") pod \"community-operators-5fsfv\" (UID: 
\"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.541434 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac617faa-373e-4ae0-8fe0-bbecee411a35-utilities\") pod \"community-operators-5fsfv\" (UID: \"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.568931 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqrk\" (UniqueName: \"kubernetes.io/projected/ac617faa-373e-4ae0-8fe0-bbecee411a35-kube-api-access-4wqrk\") pod \"community-operators-5fsfv\" (UID: \"ac617faa-373e-4ae0-8fe0-bbecee411a35\") " pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.655489 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64654fc454-wclbx"] Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.655686 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" podUID="b74d4122-7aa3-4eed-9991-4c19067b911e" containerName="controller-manager" containerID="cri-o://354df0492d1e475974568ba6a55f64271f2cf8f53da08a19b90a0a2d03de2f4c" gracePeriod=30 Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.664464 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvwl7" event={"ID":"b92116ae-2b01-4062-ba27-ce0ad95d9b83","Type":"ContainerStarted","Data":"e1ed6325c4074ca2ac04380c0f5f5b2edc9cf748dbc6e2ff2869af42f55f6ea3"} Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.666644 4585 generic.go:334] "Generic (PLEG): container finished" podID="b628bbea-ba4e-447d-a27e-22a56cea0bfc" 
containerID="69a41cc84b2e98d90a6d7fffa03f5fbef2ed1c139a3d1c6f6a0e389487ff29b8" exitCode=0 Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.666698 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85w5q" event={"ID":"b628bbea-ba4e-447d-a27e-22a56cea0bfc","Type":"ContainerDied","Data":"69a41cc84b2e98d90a6d7fffa03f5fbef2ed1c139a3d1c6f6a0e389487ff29b8"} Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.666713 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85w5q" event={"ID":"b628bbea-ba4e-447d-a27e-22a56cea0bfc","Type":"ContainerStarted","Data":"05f4818c1a3b9f2bafe758f02fcba5eed625304a8e22bba6d06ad3e6f7faf79b"} Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.669503 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xs7w7" event={"ID":"90aac239-5f41-437c-8bc1-a9164637e556","Type":"ContainerStarted","Data":"7f5d93b40973bf894a2801ebbade62d2f1a519a5d1767b2f017d51a3ab3662d4"} Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.672091 4585 generic.go:334] "Generic (PLEG): container finished" podID="e1ff32f1-5c8f-4062-b6f6-5fb81667a532" containerID="665fe3afacd0fbcc8710ab731e88adcad17861446ffd46d40da4a40ad8b95c1e" exitCode=0 Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.672149 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7xf" event={"ID":"e1ff32f1-5c8f-4062-b6f6-5fb81667a532","Type":"ContainerDied","Data":"665fe3afacd0fbcc8710ab731e88adcad17861446ffd46d40da4a40ad8b95c1e"} Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.674080 4585 generic.go:334] "Generic (PLEG): container finished" podID="f5cad97d-c68c-4859-bc2b-746461b8b361" containerID="c1624caeb134aca7e01c6b8b85ea40bda6c4cd659979fee5f01afa3f2bbc8b23" exitCode=0 Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.674127 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-nxp97" event={"ID":"f5cad97d-c68c-4859-bc2b-746461b8b361","Type":"ContainerDied","Data":"c1624caeb134aca7e01c6b8b85ea40bda6c4cd659979fee5f01afa3f2bbc8b23"} Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.681579 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.704242 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tvwl7" podStartSLOduration=3.003959798 podStartE2EDuration="5.704221392s" podCreationTimestamp="2026-02-15 17:11:36 +0000 UTC" firstStartedPulling="2026-02-15 17:11:38.535617721 +0000 UTC m=+354.479025843" lastFinishedPulling="2026-02-15 17:11:41.235879305 +0000 UTC m=+357.179287437" observedRunningTime="2026-02-15 17:11:41.702801173 +0000 UTC m=+357.646209305" watchObservedRunningTime="2026-02-15 17:11:41.704221392 +0000 UTC m=+357.647629524" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.743802 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hd7nq" Feb 15 17:11:41 crc kubenswrapper[4585]: I0215 17:11:41.829231 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xs7w7" podStartSLOduration=3.9742428690000002 podStartE2EDuration="6.82921196s" podCreationTimestamp="2026-02-15 17:11:35 +0000 UTC" firstStartedPulling="2026-02-15 17:11:38.535616001 +0000 UTC m=+354.479024133" lastFinishedPulling="2026-02-15 17:11:41.390585092 +0000 UTC m=+357.333993224" observedRunningTime="2026-02-15 17:11:41.828740237 +0000 UTC m=+357.772148389" watchObservedRunningTime="2026-02-15 17:11:41.82921196 +0000 UTC m=+357.772620092" Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.115688 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.115983 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.158755 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.263958 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5fsfv"] Feb 15 17:11:42 crc kubenswrapper[4585]: W0215 17:11:42.264699 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac617faa_373e_4ae0_8fe0_bbecee411a35.slice/crio-80ab4f11964ecbfe54cc63381234231aaf29f4db914b53f97e610bfe4cf762e8 WatchSource:0}: Error finding container 80ab4f11964ecbfe54cc63381234231aaf29f4db914b53f97e610bfe4cf762e8: Status 404 returned error can't find the container with id 80ab4f11964ecbfe54cc63381234231aaf29f4db914b53f97e610bfe4cf762e8 Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.685979 4585 generic.go:334] "Generic (PLEG): container finished" podID="b74d4122-7aa3-4eed-9991-4c19067b911e" containerID="354df0492d1e475974568ba6a55f64271f2cf8f53da08a19b90a0a2d03de2f4c" exitCode=0 Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.686137 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" event={"ID":"b74d4122-7aa3-4eed-9991-4c19067b911e","Type":"ContainerDied","Data":"354df0492d1e475974568ba6a55f64271f2cf8f53da08a19b90a0a2d03de2f4c"} Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.687881 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fsfv" 
event={"ID":"ac617faa-373e-4ae0-8fe0-bbecee411a35","Type":"ContainerStarted","Data":"80ab4f11964ecbfe54cc63381234231aaf29f4db914b53f97e610bfe4cf762e8"} Feb 15 17:11:42 crc kubenswrapper[4585]: I0215 17:11:42.763072 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jgz2x" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.310624 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.333929 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.333970 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.357246 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-595795c778-925lq"] Feb 15 17:11:43 crc kubenswrapper[4585]: E0215 17:11:43.357871 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74d4122-7aa3-4eed-9991-4c19067b911e" containerName="controller-manager" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.357888 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74d4122-7aa3-4eed-9991-4c19067b911e" containerName="controller-manager" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.358018 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74d4122-7aa3-4eed-9991-4c19067b911e" containerName="controller-manager" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.358467 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.403310 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-config\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.409378 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-595795c778-925lq"] Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.410078 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504114 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-client-ca\") pod \"b74d4122-7aa3-4eed-9991-4c19067b911e\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504170 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b74d4122-7aa3-4eed-9991-4c19067b911e-serving-cert\") pod \"b74d4122-7aa3-4eed-9991-4c19067b911e\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504234 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-proxy-ca-bundles\") pod \"b74d4122-7aa3-4eed-9991-4c19067b911e\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 
17:11:43.504310 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-config\") pod \"b74d4122-7aa3-4eed-9991-4c19067b911e\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504371 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlkbv\" (UniqueName: \"kubernetes.io/projected/b74d4122-7aa3-4eed-9991-4c19067b911e-kube-api-access-hlkbv\") pod \"b74d4122-7aa3-4eed-9991-4c19067b911e\" (UID: \"b74d4122-7aa3-4eed-9991-4c19067b911e\") " Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504553 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-config\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504615 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-proxy-ca-bundles\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504640 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-client-ca\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 
17:11:43.504658 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960631a4-c13f-47ac-b5f3-02da156908e1-serving-cert\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504683 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltp6\" (UniqueName: \"kubernetes.io/projected/960631a4-c13f-47ac-b5f3-02da156908e1-kube-api-access-8ltp6\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.504946 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-client-ca" (OuterVolumeSpecName: "client-ca") pod "b74d4122-7aa3-4eed-9991-4c19067b911e" (UID: "b74d4122-7aa3-4eed-9991-4c19067b911e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.505010 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b74d4122-7aa3-4eed-9991-4c19067b911e" (UID: "b74d4122-7aa3-4eed-9991-4c19067b911e"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.505220 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-config" (OuterVolumeSpecName: "config") pod "b74d4122-7aa3-4eed-9991-4c19067b911e" (UID: "b74d4122-7aa3-4eed-9991-4c19067b911e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.506143 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-config\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.510268 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74d4122-7aa3-4eed-9991-4c19067b911e-kube-api-access-hlkbv" (OuterVolumeSpecName: "kube-api-access-hlkbv") pod "b74d4122-7aa3-4eed-9991-4c19067b911e" (UID: "b74d4122-7aa3-4eed-9991-4c19067b911e"). InnerVolumeSpecName "kube-api-access-hlkbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.512083 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74d4122-7aa3-4eed-9991-4c19067b911e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b74d4122-7aa3-4eed-9991-4c19067b911e" (UID: "b74d4122-7aa3-4eed-9991-4c19067b911e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.605670 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-proxy-ca-bundles\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.606967 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-proxy-ca-bundles\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.606964 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-client-ca\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607027 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960631a4-c13f-47ac-b5f3-02da156908e1-serving-cert\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607059 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltp6\" (UniqueName: \"kubernetes.io/projected/960631a4-c13f-47ac-b5f3-02da156908e1-kube-api-access-8ltp6\") pod 
\"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607157 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607172 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b74d4122-7aa3-4eed-9991-4c19067b911e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607182 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607193 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74d4122-7aa3-4eed-9991-4c19067b911e-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607202 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlkbv\" (UniqueName: \"kubernetes.io/projected/b74d4122-7aa3-4eed-9991-4c19067b911e-kube-api-access-hlkbv\") on node \"crc\" DevicePath \"\"" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.607929 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/960631a4-c13f-47ac-b5f3-02da156908e1-client-ca\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.610372 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960631a4-c13f-47ac-b5f3-02da156908e1-serving-cert\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.625767 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltp6\" (UniqueName: \"kubernetes.io/projected/960631a4-c13f-47ac-b5f3-02da156908e1-kube-api-access-8ltp6\") pod \"controller-manager-595795c778-925lq\" (UID: \"960631a4-c13f-47ac-b5f3-02da156908e1\") " pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.678876 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.701029 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c7xf" event={"ID":"e1ff32f1-5c8f-4062-b6f6-5fb81667a532","Type":"ContainerStarted","Data":"d20d8435e409d49806b8f3a2a5da84afd67a007f0a1e1ea295c8b19a6d76699c"} Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.728631 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8c7xf" podStartSLOduration=3.601483947 podStartE2EDuration="6.728595125s" podCreationTimestamp="2026-02-15 17:11:37 +0000 UTC" firstStartedPulling="2026-02-15 17:11:39.57293966 +0000 UTC m=+355.516347792" lastFinishedPulling="2026-02-15 17:11:42.700050838 +0000 UTC m=+358.643458970" observedRunningTime="2026-02-15 17:11:43.727266548 +0000 UTC m=+359.670674680" watchObservedRunningTime="2026-02-15 17:11:43.728595125 +0000 UTC m=+359.672003257" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.734306 4585 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-nxp97" event={"ID":"f5cad97d-c68c-4859-bc2b-746461b8b361","Type":"ContainerStarted","Data":"988c22a48f3e1aebf046cc3249f7a66e80dd40638b93c8cba7a0d76e6aadbf76"} Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.750026 4585 generic.go:334] "Generic (PLEG): container finished" podID="ac617faa-373e-4ae0-8fe0-bbecee411a35" containerID="e503de7bc7df74b5524ccbe1b9965348e8e33205f61a85423dc2b56678200645" exitCode=0 Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.750085 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fsfv" event={"ID":"ac617faa-373e-4ae0-8fe0-bbecee411a35","Type":"ContainerDied","Data":"e503de7bc7df74b5524ccbe1b9965348e8e33205f61a85423dc2b56678200645"} Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.763825 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" event={"ID":"b74d4122-7aa3-4eed-9991-4c19067b911e","Type":"ContainerDied","Data":"f5b0949af51fbcfe740d6e629601b49f469e1d70065f5fb04d2ad056ecff9544"} Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.763873 4585 scope.go:117] "RemoveContainer" containerID="354df0492d1e475974568ba6a55f64271f2cf8f53da08a19b90a0a2d03de2f4c" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.763997 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64654fc454-wclbx" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.782118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85w5q" event={"ID":"b628bbea-ba4e-447d-a27e-22a56cea0bfc","Type":"ContainerStarted","Data":"3a4b8d28fbd55627e83e6b0e60f99933faefb6cb284a67ef31b86d9bb80898cf"} Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.865583 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64654fc454-wclbx"] Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.868621 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cgcn2" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.879278 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64654fc454-wclbx"] Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.951647 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mcd69"] Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.952936 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:43 crc kubenswrapper[4585]: I0215 17:11:43.976071 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcd69"] Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.035084 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdtnr\" (UniqueName: \"kubernetes.io/projected/669fa704-8e84-4c96-abf3-11f606645785-kube-api-access-rdtnr\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.035157 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669fa704-8e84-4c96-abf3-11f606645785-utilities\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.035177 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669fa704-8e84-4c96-abf3-11f606645785-catalog-content\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.135721 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdtnr\" (UniqueName: \"kubernetes.io/projected/669fa704-8e84-4c96-abf3-11f606645785-kube-api-access-rdtnr\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.135781 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669fa704-8e84-4c96-abf3-11f606645785-utilities\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.135801 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669fa704-8e84-4c96-abf3-11f606645785-catalog-content\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.136234 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669fa704-8e84-4c96-abf3-11f606645785-catalog-content\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.136689 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669fa704-8e84-4c96-abf3-11f606645785-utilities\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.155337 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdtnr\" (UniqueName: \"kubernetes.io/projected/669fa704-8e84-4c96-abf3-11f606645785-kube-api-access-rdtnr\") pod \"community-operators-mcd69\" (UID: \"669fa704-8e84-4c96-abf3-11f606645785\") " pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.192732 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-595795c778-925lq"] Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.276131 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.544463 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.544722 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.638222 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.771657 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcd69"] Feb 15 17:11:44 crc kubenswrapper[4585]: I0215 17:11:44.954400 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74d4122-7aa3-4eed-9991-4c19067b911e" path="/var/lib/kubelet/pods/b74d4122-7aa3-4eed-9991-4c19067b911e/volumes" Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.000285 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fsfv" event={"ID":"ac617faa-373e-4ae0-8fe0-bbecee411a35","Type":"ContainerStarted","Data":"e91d183305557071774aadc1952b6cf5258038c7cfa8b32f026dd77d61ddc757"} Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.025035 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcd69" event={"ID":"669fa704-8e84-4c96-abf3-11f606645785","Type":"ContainerStarted","Data":"a8558bd1a2671c1e82af5cc683e7e7f465ebc14d91019d4a9564d104647d6069"} Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.080417 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-595795c778-925lq" event={"ID":"960631a4-c13f-47ac-b5f3-02da156908e1","Type":"ContainerStarted","Data":"e522a768121ca6889cbb9360a4d1bb2a0d7652a8bc09d70e711d1b4a066a5758"} Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.080473 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-595795c778-925lq" event={"ID":"960631a4-c13f-47ac-b5f3-02da156908e1","Type":"ContainerStarted","Data":"95dc357d41e033c4daef32842c2127fe0caa459a213dc66a4ac00943ca49cda1"} Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.122633 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-595795c778-925lq" podStartSLOduration=4.122615463 podStartE2EDuration="4.122615463s" podCreationTimestamp="2026-02-15 17:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:11:45.122230871 +0000 UTC m=+361.065639003" watchObservedRunningTime="2026-02-15 17:11:45.122615463 +0000 UTC m=+361.066023595" Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.262652 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hvhv" Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.723203 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.723262 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:45 crc kubenswrapper[4585]: I0215 17:11:45.773965 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 
17:11:46.086095 4585 generic.go:334] "Generic (PLEG): container finished" podID="f5cad97d-c68c-4859-bc2b-746461b8b361" containerID="988c22a48f3e1aebf046cc3249f7a66e80dd40638b93c8cba7a0d76e6aadbf76" exitCode=0 Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.086190 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nxp97" event={"ID":"f5cad97d-c68c-4859-bc2b-746461b8b361","Type":"ContainerDied","Data":"988c22a48f3e1aebf046cc3249f7a66e80dd40638b93c8cba7a0d76e6aadbf76"} Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.090314 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcd69" event={"ID":"669fa704-8e84-4c96-abf3-11f606645785","Type":"ContainerStarted","Data":"04ead3ff2e6759b490956e8598825a59d3cbe87d4b347b56323b9106a2613a3a"} Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.094722 4585 generic.go:334] "Generic (PLEG): container finished" podID="b628bbea-ba4e-447d-a27e-22a56cea0bfc" containerID="3a4b8d28fbd55627e83e6b0e60f99933faefb6cb284a67ef31b86d9bb80898cf" exitCode=0 Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.094918 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85w5q" event={"ID":"b628bbea-ba4e-447d-a27e-22a56cea0bfc","Type":"ContainerDied","Data":"3a4b8d28fbd55627e83e6b0e60f99933faefb6cb284a67ef31b86d9bb80898cf"} Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.096486 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.102776 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-595795c778-925lq" Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.145642 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-9lz2d" Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.175292 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xs7w7" Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.940191 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.940678 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:46 crc kubenswrapper[4585]: I0215 17:11:46.992014 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:47 crc kubenswrapper[4585]: I0215 17:11:47.013700 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:11:47 crc kubenswrapper[4585]: I0215 17:11:47.013746 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:11:47 crc kubenswrapper[4585]: I0215 17:11:47.106263 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85w5q" event={"ID":"b628bbea-ba4e-447d-a27e-22a56cea0bfc","Type":"ContainerStarted","Data":"d80c13f2b5ab4e388a59dff00fbc802a8fdfdf5750f013448573d521e2488f50"} Feb 15 17:11:47 crc kubenswrapper[4585]: I0215 17:11:47.109169 4585 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-nxp97" event={"ID":"f5cad97d-c68c-4859-bc2b-746461b8b361","Type":"ContainerStarted","Data":"9e8b6047c5e968221211afd2f0332aca655e11a909bc38498e248c06745058f9"} Feb 15 17:11:47 crc kubenswrapper[4585]: I0215 17:11:47.150882 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tvwl7" Feb 15 17:11:47 crc kubenswrapper[4585]: I0215 17:11:47.161539 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nxp97" podStartSLOduration=4.042783929 podStartE2EDuration="9.161511375s" podCreationTimestamp="2026-02-15 17:11:38 +0000 UTC" firstStartedPulling="2026-02-15 17:11:41.676141407 +0000 UTC m=+357.619549539" lastFinishedPulling="2026-02-15 17:11:46.794868853 +0000 UTC m=+362.738276985" observedRunningTime="2026-02-15 17:11:47.158749229 +0000 UTC m=+363.102157361" watchObservedRunningTime="2026-02-15 17:11:47.161511375 +0000 UTC m=+363.104919507" Feb 15 17:11:47 crc kubenswrapper[4585]: I0215 17:11:47.164326 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85w5q" podStartSLOduration=2.046724929 podStartE2EDuration="7.164316553s" podCreationTimestamp="2026-02-15 17:11:40 +0000 UTC" firstStartedPulling="2026-02-15 17:11:41.668172748 +0000 UTC m=+357.611580880" lastFinishedPulling="2026-02-15 17:11:46.785764372 +0000 UTC m=+362.729172504" observedRunningTime="2026-02-15 17:11:47.142228413 +0000 UTC m=+363.085636565" watchObservedRunningTime="2026-02-15 17:11:47.164316553 +0000 UTC m=+363.107724685" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.069158 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.069222 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.118852 4585 generic.go:334] "Generic (PLEG): container finished" podID="ac617faa-373e-4ae0-8fe0-bbecee411a35" containerID="e91d183305557071774aadc1952b6cf5258038c7cfa8b32f026dd77d61ddc757" exitCode=0 Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.118914 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fsfv" event={"ID":"ac617faa-373e-4ae0-8fe0-bbecee411a35","Type":"ContainerDied","Data":"e91d183305557071774aadc1952b6cf5258038c7cfa8b32f026dd77d61ddc757"} Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.120654 4585 generic.go:334] "Generic (PLEG): container finished" podID="669fa704-8e84-4c96-abf3-11f606645785" containerID="04ead3ff2e6759b490956e8598825a59d3cbe87d4b347b56323b9106a2613a3a" exitCode=0 Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.120733 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcd69" event={"ID":"669fa704-8e84-4c96-abf3-11f606645785","Type":"ContainerDied","Data":"04ead3ff2e6759b490956e8598825a59d3cbe87d4b347b56323b9106a2613a3a"} Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.166681 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.227334 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8c7xf" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.368729 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x88c9"] Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.370145 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.387233 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x88c9"] Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.553741 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5489\" (UniqueName: \"kubernetes.io/projected/102ceb8e-9746-4512-b8c9-b00cec81b1a8-kube-api-access-j5489\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.553881 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/102ceb8e-9746-4512-b8c9-b00cec81b1a8-catalog-content\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.553933 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/102ceb8e-9746-4512-b8c9-b00cec81b1a8-utilities\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.655015 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/102ceb8e-9746-4512-b8c9-b00cec81b1a8-catalog-content\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.655082 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/102ceb8e-9746-4512-b8c9-b00cec81b1a8-utilities\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.655191 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5489\" (UniqueName: \"kubernetes.io/projected/102ceb8e-9746-4512-b8c9-b00cec81b1a8-kube-api-access-j5489\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.655455 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/102ceb8e-9746-4512-b8c9-b00cec81b1a8-catalog-content\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.655742 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/102ceb8e-9746-4512-b8c9-b00cec81b1a8-utilities\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.689478 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5489\" (UniqueName: \"kubernetes.io/projected/102ceb8e-9746-4512-b8c9-b00cec81b1a8-kube-api-access-j5489\") pod \"community-operators-x88c9\" (UID: \"102ceb8e-9746-4512-b8c9-b00cec81b1a8\") " pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:48 crc kubenswrapper[4585]: I0215 17:11:48.692329 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:49 crc kubenswrapper[4585]: I0215 17:11:49.134081 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5fsfv" event={"ID":"ac617faa-373e-4ae0-8fe0-bbecee411a35","Type":"ContainerStarted","Data":"bc7917e6250ab535f42e76c74be6f0d22ed7db67484f3bc89f62f04af4397ffd"} Feb 15 17:11:49 crc kubenswrapper[4585]: I0215 17:11:49.180620 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5fsfv" podStartSLOduration=3.247633451 podStartE2EDuration="8.180565752s" podCreationTimestamp="2026-02-15 17:11:41 +0000 UTC" firstStartedPulling="2026-02-15 17:11:43.758887181 +0000 UTC m=+359.702295313" lastFinishedPulling="2026-02-15 17:11:48.691819482 +0000 UTC m=+364.635227614" observedRunningTime="2026-02-15 17:11:49.180162201 +0000 UTC m=+365.123570333" watchObservedRunningTime="2026-02-15 17:11:49.180565752 +0000 UTC m=+365.123973894" Feb 15 17:11:49 crc kubenswrapper[4585]: I0215 17:11:49.295906 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x88c9"] Feb 15 17:11:49 crc kubenswrapper[4585]: I0215 17:11:49.365868 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:49 crc kubenswrapper[4585]: I0215 17:11:49.366162 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.143740 4585 generic.go:334] "Generic (PLEG): container finished" podID="669fa704-8e84-4c96-abf3-11f606645785" containerID="2eabf6529b7669dcc5a67f96eaebb5d65bec46682d80a82981316225194b286f" exitCode=0 Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.143822 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mcd69" event={"ID":"669fa704-8e84-4c96-abf3-11f606645785","Type":"ContainerDied","Data":"2eabf6529b7669dcc5a67f96eaebb5d65bec46682d80a82981316225194b286f"} Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.146481 4585 generic.go:334] "Generic (PLEG): container finished" podID="102ceb8e-9746-4512-b8c9-b00cec81b1a8" containerID="f2543adc4f58547a91225a95315b80ef684eb24e7fd3f3a11be05e4c29f61f18" exitCode=0 Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.146579 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x88c9" event={"ID":"102ceb8e-9746-4512-b8c9-b00cec81b1a8","Type":"ContainerDied","Data":"f2543adc4f58547a91225a95315b80ef684eb24e7fd3f3a11be05e4c29f61f18"} Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.146664 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x88c9" event={"ID":"102ceb8e-9746-4512-b8c9-b00cec81b1a8","Type":"ContainerStarted","Data":"7d9ce6b69dfcd78a15ef9a8eaaa55cb134f6a8460d398529358a65d5f3cccd5e"} Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.426175 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nxp97" podUID="f5cad97d-c68c-4859-bc2b-746461b8b361" containerName="registry-server" probeResult="failure" output=< Feb 15 17:11:50 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:11:50 crc kubenswrapper[4585]: > Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.489426 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.489780 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:50 crc kubenswrapper[4585]: I0215 17:11:50.539148 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:51 crc kubenswrapper[4585]: I0215 17:11:51.154308 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcd69" event={"ID":"669fa704-8e84-4c96-abf3-11f606645785","Type":"ContainerStarted","Data":"dbe66a7947abc1b10da5d7a9d4c65b7abd67d116c1be5f0ef6db51cd3c28c4a7"} Feb 15 17:11:51 crc kubenswrapper[4585]: I0215 17:11:51.156077 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x88c9" event={"ID":"102ceb8e-9746-4512-b8c9-b00cec81b1a8","Type":"ContainerStarted","Data":"913deb9c4ec33be3784e22dfe7bdd06957843ec0686af5d6d5b78d06f4bda2f6"} Feb 15 17:11:51 crc kubenswrapper[4585]: I0215 17:11:51.180070 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mcd69" podStartSLOduration=5.645118994 podStartE2EDuration="8.180051948s" podCreationTimestamp="2026-02-15 17:11:43 +0000 UTC" firstStartedPulling="2026-02-15 17:11:48.12975075 +0000 UTC m=+364.073158892" lastFinishedPulling="2026-02-15 17:11:50.664683714 +0000 UTC m=+366.608091846" observedRunningTime="2026-02-15 17:11:51.176516851 +0000 UTC m=+367.119924983" watchObservedRunningTime="2026-02-15 17:11:51.180051948 +0000 UTC m=+367.123460080" Feb 15 17:11:51 crc kubenswrapper[4585]: I0215 17:11:51.225855 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85w5q" Feb 15 17:11:51 crc kubenswrapper[4585]: I0215 17:11:51.682682 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:51 crc kubenswrapper[4585]: I0215 17:11:51.682753 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:51 crc kubenswrapper[4585]: I0215 
17:11:51.746816 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.152588 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xqp7w"] Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.154278 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.167049 4585 generic.go:334] "Generic (PLEG): container finished" podID="102ceb8e-9746-4512-b8c9-b00cec81b1a8" containerID="913deb9c4ec33be3784e22dfe7bdd06957843ec0686af5d6d5b78d06f4bda2f6" exitCode=0 Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.167125 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x88c9" event={"ID":"102ceb8e-9746-4512-b8c9-b00cec81b1a8","Type":"ContainerDied","Data":"913deb9c4ec33be3784e22dfe7bdd06957843ec0686af5d6d5b78d06f4bda2f6"} Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.172742 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqp7w"] Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.330887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e8351-6185-4fdb-b97b-29efaea499cd-utilities\") pod \"community-operators-xqp7w\" (UID: \"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.330985 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-898gx\" (UniqueName: \"kubernetes.io/projected/ec1e8351-6185-4fdb-b97b-29efaea499cd-kube-api-access-898gx\") pod \"community-operators-xqp7w\" (UID: 
\"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.331705 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e8351-6185-4fdb-b97b-29efaea499cd-catalog-content\") pod \"community-operators-xqp7w\" (UID: \"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.433393 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e8351-6185-4fdb-b97b-29efaea499cd-utilities\") pod \"community-operators-xqp7w\" (UID: \"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.433455 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-898gx\" (UniqueName: \"kubernetes.io/projected/ec1e8351-6185-4fdb-b97b-29efaea499cd-kube-api-access-898gx\") pod \"community-operators-xqp7w\" (UID: \"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.433497 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e8351-6185-4fdb-b97b-29efaea499cd-catalog-content\") pod \"community-operators-xqp7w\" (UID: \"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.433968 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec1e8351-6185-4fdb-b97b-29efaea499cd-catalog-content\") pod \"community-operators-xqp7w\" (UID: 
\"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.434179 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec1e8351-6185-4fdb-b97b-29efaea499cd-utilities\") pod \"community-operators-xqp7w\" (UID: \"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.455883 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-898gx\" (UniqueName: \"kubernetes.io/projected/ec1e8351-6185-4fdb-b97b-29efaea499cd-kube-api-access-898gx\") pod \"community-operators-xqp7w\" (UID: \"ec1e8351-6185-4fdb-b97b-29efaea499cd\") " pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:52 crc kubenswrapper[4585]: I0215 17:11:52.510108 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.040199 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqp7w"] Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.195713 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqp7w" event={"ID":"ec1e8351-6185-4fdb-b97b-29efaea499cd","Type":"ContainerStarted","Data":"8520ca8c6633c4c65cee7f6ab8439f0b0884263b32e334685be3aac2ac8a6bdb"} Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.195779 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqp7w" event={"ID":"ec1e8351-6185-4fdb-b97b-29efaea499cd","Type":"ContainerStarted","Data":"403b7e612ace0bbf3a7f084d0533a2adb97646d8a6798e8a5fe0055cb7ea8b8e"} Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.201634 4585 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-x88c9" event={"ID":"102ceb8e-9746-4512-b8c9-b00cec81b1a8","Type":"ContainerStarted","Data":"2621e7ec832ca4ef268aebccc3eca200bb9e9ff5689e10b5d583910c65d2d7d6"} Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.236750 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x88c9" podStartSLOduration=2.792522979 podStartE2EDuration="5.236735372s" podCreationTimestamp="2026-02-15 17:11:48 +0000 UTC" firstStartedPulling="2026-02-15 17:11:50.148145717 +0000 UTC m=+366.091553849" lastFinishedPulling="2026-02-15 17:11:52.59235811 +0000 UTC m=+368.535766242" observedRunningTime="2026-02-15 17:11:53.231875458 +0000 UTC m=+369.175283630" watchObservedRunningTime="2026-02-15 17:11:53.236735372 +0000 UTC m=+369.180143504" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.355363 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xntlx"] Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.356763 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.374925 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xntlx"] Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.447684 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54df0770-f532-412f-985c-fa6514121ecf-utilities\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.447939 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54df0770-f532-412f-985c-fa6514121ecf-catalog-content\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.448009 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg6x2\" (UniqueName: \"kubernetes.io/projected/54df0770-f532-412f-985c-fa6514121ecf-kube-api-access-mg6x2\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.549760 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54df0770-f532-412f-985c-fa6514121ecf-utilities\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.549914 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54df0770-f532-412f-985c-fa6514121ecf-catalog-content\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.549979 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg6x2\" (UniqueName: \"kubernetes.io/projected/54df0770-f532-412f-985c-fa6514121ecf-kube-api-access-mg6x2\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.550767 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54df0770-f532-412f-985c-fa6514121ecf-utilities\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.551069 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54df0770-f532-412f-985c-fa6514121ecf-catalog-content\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.570299 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg6x2\" (UniqueName: \"kubernetes.io/projected/54df0770-f532-412f-985c-fa6514121ecf-kube-api-access-mg6x2\") pod \"community-operators-xntlx\" (UID: \"54df0770-f532-412f-985c-fa6514121ecf\") " pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:53 crc kubenswrapper[4585]: I0215 17:11:53.742237 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.208273 4585 generic.go:334] "Generic (PLEG): container finished" podID="ec1e8351-6185-4fdb-b97b-29efaea499cd" containerID="8520ca8c6633c4c65cee7f6ab8439f0b0884263b32e334685be3aac2ac8a6bdb" exitCode=0 Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.208368 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqp7w" event={"ID":"ec1e8351-6185-4fdb-b97b-29efaea499cd","Type":"ContainerDied","Data":"8520ca8c6633c4c65cee7f6ab8439f0b0884263b32e334685be3aac2ac8a6bdb"} Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.208816 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqp7w" event={"ID":"ec1e8351-6185-4fdb-b97b-29efaea499cd","Type":"ContainerStarted","Data":"d3323c5c4181c377f7dd116c319ff837affc41b969128e5c85839b1f4b55faf3"} Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.277361 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.277409 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.389504 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xntlx"] Feb 15 17:11:54 crc kubenswrapper[4585]: W0215 17:11:54.392259 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54df0770_f532_412f_985c_fa6514121ecf.slice/crio-631e3ddbea1f4d4c737e031d8bdaf7aca54812e76ea8b21e7d6620a9a986f8d0 WatchSource:0}: Error finding container 631e3ddbea1f4d4c737e031d8bdaf7aca54812e76ea8b21e7d6620a9a986f8d0: Status 404 returned error can't find the container 
with id 631e3ddbea1f4d4c737e031d8bdaf7aca54812e76ea8b21e7d6620a9a986f8d0 Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.550562 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jp2sv"] Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.551843 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.564897 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jp2sv"] Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.666188 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl7qv\" (UniqueName: \"kubernetes.io/projected/09f66f54-3515-402f-b986-e278748ef6d4-kube-api-access-jl7qv\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.666252 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f66f54-3515-402f-b986-e278748ef6d4-catalog-content\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.666283 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f66f54-3515-402f-b986-e278748ef6d4-utilities\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.767482 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-jl7qv\" (UniqueName: \"kubernetes.io/projected/09f66f54-3515-402f-b986-e278748ef6d4-kube-api-access-jl7qv\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.767552 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f66f54-3515-402f-b986-e278748ef6d4-catalog-content\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.767582 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f66f54-3515-402f-b986-e278748ef6d4-utilities\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.768090 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f66f54-3515-402f-b986-e278748ef6d4-catalog-content\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.768137 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f66f54-3515-402f-b986-e278748ef6d4-utilities\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.784863 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl7qv\" (UniqueName: 
\"kubernetes.io/projected/09f66f54-3515-402f-b986-e278748ef6d4-kube-api-access-jl7qv\") pod \"community-operators-jp2sv\" (UID: \"09f66f54-3515-402f-b986-e278748ef6d4\") " pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:54 crc kubenswrapper[4585]: I0215 17:11:54.949974 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.236038 4585 generic.go:334] "Generic (PLEG): container finished" podID="ec1e8351-6185-4fdb-b97b-29efaea499cd" containerID="d3323c5c4181c377f7dd116c319ff837affc41b969128e5c85839b1f4b55faf3" exitCode=0 Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.236107 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqp7w" event={"ID":"ec1e8351-6185-4fdb-b97b-29efaea499cd","Type":"ContainerDied","Data":"d3323c5c4181c377f7dd116c319ff837affc41b969128e5c85839b1f4b55faf3"} Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.253022 4585 generic.go:334] "Generic (PLEG): container finished" podID="54df0770-f532-412f-985c-fa6514121ecf" containerID="beab2d37fdaf4bfad7e7fe6a39a2ecbe6c91906a06e9601312c521864852fc3b" exitCode=0 Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.253057 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xntlx" event={"ID":"54df0770-f532-412f-985c-fa6514121ecf","Type":"ContainerDied","Data":"beab2d37fdaf4bfad7e7fe6a39a2ecbe6c91906a06e9601312c521864852fc3b"} Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.253081 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xntlx" event={"ID":"54df0770-f532-412f-985c-fa6514121ecf","Type":"ContainerStarted","Data":"631e3ddbea1f4d4c737e031d8bdaf7aca54812e76ea8b21e7d6620a9a986f8d0"} Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.378876 4585 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/community-operators-mcd69" podUID="669fa704-8e84-4c96-abf3-11f606645785" containerName="registry-server" probeResult="failure" output=< Feb 15 17:11:55 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:11:55 crc kubenswrapper[4585]: > Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.500857 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jp2sv"] Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.762496 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9h2zb"] Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.764171 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.820669 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9h2zb"] Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.913495 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8137661-759d-4fea-8a45-47e438c9fbe6-utilities\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.913552 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8137661-759d-4fea-8a45-47e438c9fbe6-catalog-content\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:55 crc kubenswrapper[4585]: I0215 17:11:55.913737 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-zs4dz\" (UniqueName: \"kubernetes.io/projected/e8137661-759d-4fea-8a45-47e438c9fbe6-kube-api-access-zs4dz\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.014854 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8137661-759d-4fea-8a45-47e438c9fbe6-utilities\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.014918 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8137661-759d-4fea-8a45-47e438c9fbe6-catalog-content\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.014985 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4dz\" (UniqueName: \"kubernetes.io/projected/e8137661-759d-4fea-8a45-47e438c9fbe6-kube-api-access-zs4dz\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.015683 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8137661-759d-4fea-8a45-47e438c9fbe6-catalog-content\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.015739 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8137661-759d-4fea-8a45-47e438c9fbe6-utilities\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.034715 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4dz\" (UniqueName: \"kubernetes.io/projected/e8137661-759d-4fea-8a45-47e438c9fbe6-kube-api-access-zs4dz\") pod \"community-operators-9h2zb\" (UID: \"e8137661-759d-4fea-8a45-47e438c9fbe6\") " pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.140647 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.269854 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqp7w" event={"ID":"ec1e8351-6185-4fdb-b97b-29efaea499cd","Type":"ContainerStarted","Data":"6dd802a2cee96f545a26d9f18a79df00639accf15c5de0f4cb0585bcbb47d8ae"} Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.271466 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xntlx" event={"ID":"54df0770-f532-412f-985c-fa6514121ecf","Type":"ContainerStarted","Data":"168df22087e9f06f7e4c8ced2559d5f54e48c2c8d4f91a6cbc7d36c90dfdbb74"} Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.277647 4585 generic.go:334] "Generic (PLEG): container finished" podID="09f66f54-3515-402f-b986-e278748ef6d4" containerID="1b97ba802844f928f025fddf075717e4d421beafb5c703dfdf9725f5e7b6fae8" exitCode=0 Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.277699 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jp2sv" 
event={"ID":"09f66f54-3515-402f-b986-e278748ef6d4","Type":"ContainerDied","Data":"1b97ba802844f928f025fddf075717e4d421beafb5c703dfdf9725f5e7b6fae8"} Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.277731 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jp2sv" event={"ID":"09f66f54-3515-402f-b986-e278748ef6d4","Type":"ContainerStarted","Data":"c0de6f3df8bc393966f0df72b4dcffd47d5de4c584bb68ed35a74a513c0ae00d"} Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.308798 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xqp7w" podStartSLOduration=1.8811698350000001 podStartE2EDuration="4.30878243s" podCreationTimestamp="2026-02-15 17:11:52 +0000 UTC" firstStartedPulling="2026-02-15 17:11:53.198684212 +0000 UTC m=+369.142092344" lastFinishedPulling="2026-02-15 17:11:55.626296807 +0000 UTC m=+371.569704939" observedRunningTime="2026-02-15 17:11:56.304260135 +0000 UTC m=+372.247668267" watchObservedRunningTime="2026-02-15 17:11:56.30878243 +0000 UTC m=+372.252190562" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.702456 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9h2zb"] Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.963999 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xblvr"] Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.965314 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:56 crc kubenswrapper[4585]: I0215 17:11:56.974255 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xblvr"] Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.042637 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ccbe5c-984a-4e61-9897-6357cdd14cab-utilities\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.042732 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ccbe5c-984a-4e61-9897-6357cdd14cab-catalog-content\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.042814 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rps65\" (UniqueName: \"kubernetes.io/projected/26ccbe5c-984a-4e61-9897-6357cdd14cab-kube-api-access-rps65\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.144711 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ccbe5c-984a-4e61-9897-6357cdd14cab-catalog-content\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.144797 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rps65\" (UniqueName: \"kubernetes.io/projected/26ccbe5c-984a-4e61-9897-6357cdd14cab-kube-api-access-rps65\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.144831 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ccbe5c-984a-4e61-9897-6357cdd14cab-utilities\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.145213 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ccbe5c-984a-4e61-9897-6357cdd14cab-utilities\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.145439 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ccbe5c-984a-4e61-9897-6357cdd14cab-catalog-content\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.171001 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rps65\" (UniqueName: \"kubernetes.io/projected/26ccbe5c-984a-4e61-9897-6357cdd14cab-kube-api-access-rps65\") pod \"community-operators-xblvr\" (UID: \"26ccbe5c-984a-4e61-9897-6357cdd14cab\") " pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.284809 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-jp2sv" event={"ID":"09f66f54-3515-402f-b986-e278748ef6d4","Type":"ContainerStarted","Data":"02d5fb12dae76ceb91a8aacfb681aa7075b729715d057c048bca9b338f7a14de"} Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.286909 4585 generic.go:334] "Generic (PLEG): container finished" podID="54df0770-f532-412f-985c-fa6514121ecf" containerID="168df22087e9f06f7e4c8ced2559d5f54e48c2c8d4f91a6cbc7d36c90dfdbb74" exitCode=0 Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.286953 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xntlx" event={"ID":"54df0770-f532-412f-985c-fa6514121ecf","Type":"ContainerDied","Data":"168df22087e9f06f7e4c8ced2559d5f54e48c2c8d4f91a6cbc7d36c90dfdbb74"} Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.288712 4585 generic.go:334] "Generic (PLEG): container finished" podID="e8137661-759d-4fea-8a45-47e438c9fbe6" containerID="d4f09a1ea40ca29ff441c46c29a1215c2b8222bcae58e2fc95559f6c50e7a372" exitCode=0 Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.289497 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9h2zb" event={"ID":"e8137661-759d-4fea-8a45-47e438c9fbe6","Type":"ContainerDied","Data":"d4f09a1ea40ca29ff441c46c29a1215c2b8222bcae58e2fc95559f6c50e7a372"} Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.289512 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9h2zb" event={"ID":"e8137661-759d-4fea-8a45-47e438c9fbe6","Type":"ContainerStarted","Data":"b7829949b3e9a67d62e382f7d18e31cb7eba38811d4ee94af0be0b6078c46841"} Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.339935 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:11:57 crc kubenswrapper[4585]: I0215 17:11:57.924085 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xblvr"] Feb 15 17:11:57 crc kubenswrapper[4585]: W0215 17:11:57.934153 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26ccbe5c_984a_4e61_9897_6357cdd14cab.slice/crio-e94e9883aad77d2831ab58eea9ec3c323d6f0c3b5e69a40c5f4e2453c9794a44 WatchSource:0}: Error finding container e94e9883aad77d2831ab58eea9ec3c323d6f0c3b5e69a40c5f4e2453c9794a44: Status 404 returned error can't find the container with id e94e9883aad77d2831ab58eea9ec3c323d6f0c3b5e69a40c5f4e2453c9794a44 Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.154707 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vcnr2"] Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.156417 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.165738 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcnr2"] Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.267859 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ht9\" (UniqueName: \"kubernetes.io/projected/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-kube-api-access-w4ht9\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.268172 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-utilities\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.268194 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-catalog-content\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.295769 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9h2zb" event={"ID":"e8137661-759d-4fea-8a45-47e438c9fbe6","Type":"ContainerStarted","Data":"7bb7259c2d163362abce4fc5e128975a337617459240ccc780afb3d5d3258144"} Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.298787 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xntlx" 
event={"ID":"54df0770-f532-412f-985c-fa6514121ecf","Type":"ContainerStarted","Data":"dda7f9d5e890a46a3039c624313d7642baa54091d4201b3c6ea9f8acfe9ab91e"} Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.300356 4585 generic.go:334] "Generic (PLEG): container finished" podID="09f66f54-3515-402f-b986-e278748ef6d4" containerID="02d5fb12dae76ceb91a8aacfb681aa7075b729715d057c048bca9b338f7a14de" exitCode=0 Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.300409 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jp2sv" event={"ID":"09f66f54-3515-402f-b986-e278748ef6d4","Type":"ContainerDied","Data":"02d5fb12dae76ceb91a8aacfb681aa7075b729715d057c048bca9b338f7a14de"} Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.302129 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xblvr" event={"ID":"26ccbe5c-984a-4e61-9897-6357cdd14cab","Type":"ContainerStarted","Data":"990a88f46a3c0a038b1efb0850b037c68fecce7552c345fe70b80ba6c5e3ae6b"} Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.302164 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xblvr" event={"ID":"26ccbe5c-984a-4e61-9897-6357cdd14cab","Type":"ContainerStarted","Data":"e94e9883aad77d2831ab58eea9ec3c323d6f0c3b5e69a40c5f4e2453c9794a44"} Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.333000 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xntlx" podStartSLOduration=2.708246687 podStartE2EDuration="5.332985588s" podCreationTimestamp="2026-02-15 17:11:53 +0000 UTC" firstStartedPulling="2026-02-15 17:11:55.254649016 +0000 UTC m=+371.198057148" lastFinishedPulling="2026-02-15 17:11:57.879387917 +0000 UTC m=+373.822796049" observedRunningTime="2026-02-15 17:11:58.329554793 +0000 UTC m=+374.272962925" watchObservedRunningTime="2026-02-15 17:11:58.332985588 +0000 UTC 
m=+374.276393720" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.373822 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ht9\" (UniqueName: \"kubernetes.io/projected/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-kube-api-access-w4ht9\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.373890 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-utilities\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.373908 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-catalog-content\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.374408 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-catalog-content\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.374898 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-utilities\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc 
kubenswrapper[4585]: I0215 17:11:58.392443 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ht9\" (UniqueName: \"kubernetes.io/projected/62d2ce7a-0eba-4d09-8c96-260ff647e3b2-kube-api-access-w4ht9\") pod \"community-operators-vcnr2\" (UID: \"62d2ce7a-0eba-4d09-8c96-260ff647e3b2\") " pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.519762 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.693562 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.693931 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.796343 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:58 crc kubenswrapper[4585]: I0215 17:11:58.994583 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcnr2"] Feb 15 17:11:58 crc kubenswrapper[4585]: W0215 17:11:58.997174 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d2ce7a_0eba_4d09_8c96_260ff647e3b2.slice/crio-6132006b1f03545427925b768e76673fbd5e931be70a8f7460e0ac55425ef123 WatchSource:0}: Error finding container 6132006b1f03545427925b768e76673fbd5e931be70a8f7460e0ac55425ef123: Status 404 returned error can't find the container with id 6132006b1f03545427925b768e76673fbd5e931be70a8f7460e0ac55425ef123 Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.316498 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-jp2sv" event={"ID":"09f66f54-3515-402f-b986-e278748ef6d4","Type":"ContainerStarted","Data":"7389f0dd28b3053fa210d354507945511f0edbc3d53c57bdbd7e2dbc13582618"} Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.319153 4585 generic.go:334] "Generic (PLEG): container finished" podID="26ccbe5c-984a-4e61-9897-6357cdd14cab" containerID="990a88f46a3c0a038b1efb0850b037c68fecce7552c345fe70b80ba6c5e3ae6b" exitCode=0 Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.319544 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xblvr" event={"ID":"26ccbe5c-984a-4e61-9897-6357cdd14cab","Type":"ContainerDied","Data":"990a88f46a3c0a038b1efb0850b037c68fecce7552c345fe70b80ba6c5e3ae6b"} Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.331390 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcnr2" event={"ID":"62d2ce7a-0eba-4d09-8c96-260ff647e3b2","Type":"ContainerStarted","Data":"a54a9054661aa26933457b1eec857a3fe5561848e76973ffc52fbaea39e52cd1"} Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.331717 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcnr2" event={"ID":"62d2ce7a-0eba-4d09-8c96-260ff647e3b2","Type":"ContainerStarted","Data":"6132006b1f03545427925b768e76673fbd5e931be70a8f7460e0ac55425ef123"} Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.339277 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jp2sv" podStartSLOduration=2.858075629 podStartE2EDuration="5.339265651s" podCreationTimestamp="2026-02-15 17:11:54 +0000 UTC" firstStartedPulling="2026-02-15 17:11:56.279359768 +0000 UTC m=+372.222767900" lastFinishedPulling="2026-02-15 17:11:58.76054979 +0000 UTC m=+374.703957922" observedRunningTime="2026-02-15 17:11:59.337154994 +0000 UTC m=+375.280563116" 
watchObservedRunningTime="2026-02-15 17:11:59.339265651 +0000 UTC m=+375.282673783" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.367773 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vwfmx"] Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.369464 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.395698 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwfmx"] Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.395879 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x88c9" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.409955 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.481838 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nxp97" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.491387 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372e5a42-5ade-4cc6-a41f-72d5119f2e64-catalog-content\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.491436 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372e5a42-5ade-4cc6-a41f-72d5119f2e64-utilities\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " 
pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.491502 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7wtw\" (UniqueName: \"kubernetes.io/projected/372e5a42-5ade-4cc6-a41f-72d5119f2e64-kube-api-access-f7wtw\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.592869 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372e5a42-5ade-4cc6-a41f-72d5119f2e64-catalog-content\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.592925 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372e5a42-5ade-4cc6-a41f-72d5119f2e64-utilities\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.592960 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7wtw\" (UniqueName: \"kubernetes.io/projected/372e5a42-5ade-4cc6-a41f-72d5119f2e64-kube-api-access-f7wtw\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.593650 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/372e5a42-5ade-4cc6-a41f-72d5119f2e64-catalog-content\") pod \"community-operators-vwfmx\" (UID: 
\"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.593865 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/372e5a42-5ade-4cc6-a41f-72d5119f2e64-utilities\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.627953 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7wtw\" (UniqueName: \"kubernetes.io/projected/372e5a42-5ade-4cc6-a41f-72d5119f2e64-kube-api-access-f7wtw\") pod \"community-operators-vwfmx\" (UID: \"372e5a42-5ade-4cc6-a41f-72d5119f2e64\") " pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:11:59 crc kubenswrapper[4585]: I0215 17:11:59.683499 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.245068 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwfmx"] Feb 15 17:12:00 crc kubenswrapper[4585]: W0215 17:12:00.253855 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod372e5a42_5ade_4cc6_a41f_72d5119f2e64.slice/crio-6e20aff490c6f5562941a4469b3a4f9b117c55e0e25e236824de48e570e2d1a0 WatchSource:0}: Error finding container 6e20aff490c6f5562941a4469b3a4f9b117c55e0e25e236824de48e570e2d1a0: Status 404 returned error can't find the container with id 6e20aff490c6f5562941a4469b3a4f9b117c55e0e25e236824de48e570e2d1a0 Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.341011 4585 generic.go:334] "Generic (PLEG): container finished" podID="e8137661-759d-4fea-8a45-47e438c9fbe6" 
containerID="7bb7259c2d163362abce4fc5e128975a337617459240ccc780afb3d5d3258144" exitCode=0 Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.341694 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9h2zb" event={"ID":"e8137661-759d-4fea-8a45-47e438c9fbe6","Type":"ContainerDied","Data":"7bb7259c2d163362abce4fc5e128975a337617459240ccc780afb3d5d3258144"} Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.344067 4585 generic.go:334] "Generic (PLEG): container finished" podID="62d2ce7a-0eba-4d09-8c96-260ff647e3b2" containerID="a54a9054661aa26933457b1eec857a3fe5561848e76973ffc52fbaea39e52cd1" exitCode=0 Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.344116 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcnr2" event={"ID":"62d2ce7a-0eba-4d09-8c96-260ff647e3b2","Type":"ContainerDied","Data":"a54a9054661aa26933457b1eec857a3fe5561848e76973ffc52fbaea39e52cd1"} Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.348614 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xblvr" event={"ID":"26ccbe5c-984a-4e61-9897-6357cdd14cab","Type":"ContainerStarted","Data":"ac639afa376efd05ddf12172053b37a0c4207ff3a0b64f52c9b579307d682763"} Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.351373 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfmx" event={"ID":"372e5a42-5ade-4cc6-a41f-72d5119f2e64","Type":"ContainerStarted","Data":"6e20aff490c6f5562941a4469b3a4f9b117c55e0e25e236824de48e570e2d1a0"} Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.556965 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-whfqg"] Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.558927 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.573731 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-whfqg"] Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.708821 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc442\" (UniqueName: \"kubernetes.io/projected/e8336aa7-5445-4d29-a676-e4a588683b09-kube-api-access-sc442\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.708889 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8336aa7-5445-4d29-a676-e4a588683b09-utilities\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.708931 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8336aa7-5445-4d29-a676-e4a588683b09-catalog-content\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.810820 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc442\" (UniqueName: \"kubernetes.io/projected/e8336aa7-5445-4d29-a676-e4a588683b09-kube-api-access-sc442\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.811266 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8336aa7-5445-4d29-a676-e4a588683b09-utilities\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.811426 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8336aa7-5445-4d29-a676-e4a588683b09-catalog-content\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.811988 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8336aa7-5445-4d29-a676-e4a588683b09-catalog-content\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.812345 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8336aa7-5445-4d29-a676-e4a588683b09-utilities\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.831507 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc442\" (UniqueName: \"kubernetes.io/projected/e8336aa7-5445-4d29-a676-e4a588683b09-kube-api-access-sc442\") pod \"community-operators-whfqg\" (UID: \"e8336aa7-5445-4d29-a676-e4a588683b09\") " pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:00 crc kubenswrapper[4585]: I0215 17:12:00.892651 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.372829 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9h2zb" event={"ID":"e8137661-759d-4fea-8a45-47e438c9fbe6","Type":"ContainerStarted","Data":"c71ca90e54a638c277a404c51a0fed0d6c2dffd3d564ee4a4975aacf61ac8ea8"} Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.378376 4585 generic.go:334] "Generic (PLEG): container finished" podID="26ccbe5c-984a-4e61-9897-6357cdd14cab" containerID="ac639afa376efd05ddf12172053b37a0c4207ff3a0b64f52c9b579307d682763" exitCode=0 Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.378431 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xblvr" event={"ID":"26ccbe5c-984a-4e61-9897-6357cdd14cab","Type":"ContainerDied","Data":"ac639afa376efd05ddf12172053b37a0c4207ff3a0b64f52c9b579307d682763"} Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.381334 4585 generic.go:334] "Generic (PLEG): container finished" podID="372e5a42-5ade-4cc6-a41f-72d5119f2e64" containerID="4f00aaeb46b94489cf3006612484b06aee39c983e76ca48e88dc4031998f7114" exitCode=0 Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.381369 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfmx" event={"ID":"372e5a42-5ade-4cc6-a41f-72d5119f2e64","Type":"ContainerDied","Data":"4f00aaeb46b94489cf3006612484b06aee39c983e76ca48e88dc4031998f7114"} Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.395325 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9h2zb" podStartSLOduration=2.950756925 podStartE2EDuration="6.395312237s" podCreationTimestamp="2026-02-15 17:11:55 +0000 UTC" firstStartedPulling="2026-02-15 17:11:57.291348479 +0000 UTC m=+373.234756611" lastFinishedPulling="2026-02-15 17:12:00.735903801 
+0000 UTC m=+376.679311923" observedRunningTime="2026-02-15 17:12:01.392321135 +0000 UTC m=+377.335729267" watchObservedRunningTime="2026-02-15 17:12:01.395312237 +0000 UTC m=+377.338720369" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.440538 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-whfqg"] Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.756541 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5fsfv" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.762743 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-spm6s"] Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.764501 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.794716 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spm6s"] Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.838794 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4f72\" (UniqueName: \"kubernetes.io/projected/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-kube-api-access-k4f72\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.838909 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-utilities\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.838939 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-catalog-content\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.940315 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-catalog-content\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.940401 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4f72\" (UniqueName: \"kubernetes.io/projected/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-kube-api-access-k4f72\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.940538 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-utilities\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.940988 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-catalog-content\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.940994 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-utilities\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:01 crc kubenswrapper[4585]: I0215 17:12:01.995704 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4f72\" (UniqueName: \"kubernetes.io/projected/ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe-kube-api-access-k4f72\") pod \"community-operators-spm6s\" (UID: \"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe\") " pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.094249 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.419910 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfmx" event={"ID":"372e5a42-5ade-4cc6-a41f-72d5119f2e64","Type":"ContainerStarted","Data":"d33af767d759dce1673333475a04e6a30f5e356edf15cde1034585f0d049e39a"} Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.427346 4585 generic.go:334] "Generic (PLEG): container finished" podID="62d2ce7a-0eba-4d09-8c96-260ff647e3b2" containerID="a49cfdbd4770935b32d964b39c8682ed5baa9c7bbd755a0ce373df71d1ca94ec" exitCode=0 Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.427414 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcnr2" event={"ID":"62d2ce7a-0eba-4d09-8c96-260ff647e3b2","Type":"ContainerDied","Data":"a49cfdbd4770935b32d964b39c8682ed5baa9c7bbd755a0ce373df71d1ca94ec"} Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.432807 4585 generic.go:334] "Generic (PLEG): container finished" podID="e8336aa7-5445-4d29-a676-e4a588683b09" 
containerID="52b513789ec629d861ee62af342eb69214d713012bf3b4f50a397edb8f30c291" exitCode=0 Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.432866 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-whfqg" event={"ID":"e8336aa7-5445-4d29-a676-e4a588683b09","Type":"ContainerDied","Data":"52b513789ec629d861ee62af342eb69214d713012bf3b4f50a397edb8f30c291"} Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.432890 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-whfqg" event={"ID":"e8336aa7-5445-4d29-a676-e4a588683b09","Type":"ContainerStarted","Data":"af5208b198f35292b0c5bba49d06089b905ded6b1ad0ea87a048227569dc22e1"} Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.451728 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xblvr" event={"ID":"26ccbe5c-984a-4e61-9897-6357cdd14cab","Type":"ContainerStarted","Data":"fd1405ac508af97afcf027a314e97bb8f8968c43897b996cc2f810d71f2f4c8f"} Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.510819 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.510873 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.525156 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xblvr" podStartSLOduration=4.051056773 podStartE2EDuration="6.525137649s" podCreationTimestamp="2026-02-15 17:11:56 +0000 UTC" firstStartedPulling="2026-02-15 17:11:59.327810345 +0000 UTC m=+375.271218477" lastFinishedPulling="2026-02-15 17:12:01.801891221 +0000 UTC m=+377.745299353" observedRunningTime="2026-02-15 17:12:02.518778323 +0000 UTC m=+378.462186475" 
watchObservedRunningTime="2026-02-15 17:12:02.525137649 +0000 UTC m=+378.468545771" Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.554795 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spm6s"] Feb 15 17:12:02 crc kubenswrapper[4585]: W0215 17:12:02.563166 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad8c0d7b_9e86_4762_a59b_6e6bc4186bfe.slice/crio-d044a2a50106a95302da5e70dfcbbdea9cccb30429c7aef291c98c54b0f926af WatchSource:0}: Error finding container d044a2a50106a95302da5e70dfcbbdea9cccb30429c7aef291c98c54b0f926af: Status 404 returned error can't find the container with id d044a2a50106a95302da5e70dfcbbdea9cccb30429c7aef291c98c54b0f926af Feb 15 17:12:02 crc kubenswrapper[4585]: I0215 17:12:02.599176 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.460425 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-whfqg" event={"ID":"e8336aa7-5445-4d29-a676-e4a588683b09","Type":"ContainerStarted","Data":"9b75d3941160b129bf0617d802c5713478bbad54ec7c636a15259396ff576f92"} Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.462965 4585 generic.go:334] "Generic (PLEG): container finished" podID="372e5a42-5ade-4cc6-a41f-72d5119f2e64" containerID="d33af767d759dce1673333475a04e6a30f5e356edf15cde1034585f0d049e39a" exitCode=0 Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.463058 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfmx" event={"ID":"372e5a42-5ade-4cc6-a41f-72d5119f2e64","Type":"ContainerDied","Data":"d33af767d759dce1673333475a04e6a30f5e356edf15cde1034585f0d049e39a"} Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.465716 4585 generic.go:334] "Generic (PLEG): container 
finished" podID="ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe" containerID="149e4e2894042ff494e8c757a4923c25d9caf42474f3649f027d56422420592d" exitCode=0 Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.465798 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spm6s" event={"ID":"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe","Type":"ContainerDied","Data":"149e4e2894042ff494e8c757a4923c25d9caf42474f3649f027d56422420592d"} Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.465818 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spm6s" event={"ID":"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe","Type":"ContainerStarted","Data":"d044a2a50106a95302da5e70dfcbbdea9cccb30429c7aef291c98c54b0f926af"} Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.470476 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcnr2" event={"ID":"62d2ce7a-0eba-4d09-8c96-260ff647e3b2","Type":"ContainerStarted","Data":"cfbdcc651b46941c0ec5ee8347d8b528092c7ffe28da114fdd3760202ef4587f"} Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.555103 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xqp7w" Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.567157 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vcnr2" podStartSLOduration=3.074377287 podStartE2EDuration="5.567144268s" podCreationTimestamp="2026-02-15 17:11:58 +0000 UTC" firstStartedPulling="2026-02-15 17:12:00.347767576 +0000 UTC m=+376.291175708" lastFinishedPulling="2026-02-15 17:12:02.840534557 +0000 UTC m=+378.783942689" observedRunningTime="2026-02-15 17:12:03.560421112 +0000 UTC m=+379.503829244" watchObservedRunningTime="2026-02-15 17:12:03.567144268 +0000 UTC m=+379.510552400" Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.743103 
4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:12:03 crc kubenswrapper[4585]: I0215 17:12:03.743196 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.330222 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.380234 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mcd69" Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.476371 4585 generic.go:334] "Generic (PLEG): container finished" podID="e8336aa7-5445-4d29-a676-e4a588683b09" containerID="9b75d3941160b129bf0617d802c5713478bbad54ec7c636a15259396ff576f92" exitCode=0 Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.476433 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-whfqg" event={"ID":"e8336aa7-5445-4d29-a676-e4a588683b09","Type":"ContainerDied","Data":"9b75d3941160b129bf0617d802c5713478bbad54ec7c636a15259396ff576f92"} Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.482429 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwfmx" event={"ID":"372e5a42-5ade-4cc6-a41f-72d5119f2e64","Type":"ContainerStarted","Data":"7e4f292926042b598ab9fa7ce5e751bf2bd334f7911287d75dfba34dda4dc7e4"} Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.485209 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spm6s" event={"ID":"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe","Type":"ContainerStarted","Data":"601aea5b97a752144af24cf2b77e4d4f9ac401f1230550fad860eb01bab3a91c"} Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.542114 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vwfmx" podStartSLOduration=2.933735178 podStartE2EDuration="5.542094097s" podCreationTimestamp="2026-02-15 17:11:59 +0000 UTC" firstStartedPulling="2026-02-15 17:12:01.382720461 +0000 UTC m=+377.326128593" lastFinishedPulling="2026-02-15 17:12:03.99107938 +0000 UTC m=+379.934487512" observedRunningTime="2026-02-15 17:12:04.536669207 +0000 UTC m=+380.480077349" watchObservedRunningTime="2026-02-15 17:12:04.542094097 +0000 UTC m=+380.485502229" Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.800952 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xntlx" podUID="54df0770-f532-412f-985c-fa6514121ecf" containerName="registry-server" probeResult="failure" output=< Feb 15 17:12:04 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:12:04 crc kubenswrapper[4585]: > Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.950185 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.950241 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:12:04 crc kubenswrapper[4585]: I0215 17:12:04.991685 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:12:05 crc kubenswrapper[4585]: I0215 17:12:05.493452 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-whfqg" event={"ID":"e8336aa7-5445-4d29-a676-e4a588683b09","Type":"ContainerStarted","Data":"bc614efa64eb6d1e0aaf4ea0145680c582caa256c0b0ad8f9f1f19534c6fc41f"} Feb 15 17:12:05 crc kubenswrapper[4585]: I0215 17:12:05.496039 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe" containerID="601aea5b97a752144af24cf2b77e4d4f9ac401f1230550fad860eb01bab3a91c" exitCode=0 Feb 15 17:12:05 crc kubenswrapper[4585]: I0215 17:12:05.496083 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spm6s" event={"ID":"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe","Type":"ContainerDied","Data":"601aea5b97a752144af24cf2b77e4d4f9ac401f1230550fad860eb01bab3a91c"} Feb 15 17:12:05 crc kubenswrapper[4585]: I0215 17:12:05.526462 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-whfqg" podStartSLOduration=3.045712346 podStartE2EDuration="5.526444895s" podCreationTimestamp="2026-02-15 17:12:00 +0000 UTC" firstStartedPulling="2026-02-15 17:12:02.435774694 +0000 UTC m=+378.379182826" lastFinishedPulling="2026-02-15 17:12:04.916507243 +0000 UTC m=+380.859915375" observedRunningTime="2026-02-15 17:12:05.525680994 +0000 UTC m=+381.469089126" watchObservedRunningTime="2026-02-15 17:12:05.526444895 +0000 UTC m=+381.469853027" Feb 15 17:12:05 crc kubenswrapper[4585]: I0215 17:12:05.568370 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jp2sv" Feb 15 17:12:05 crc kubenswrapper[4585]: I0215 17:12:05.800152 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" podUID="23788c17-8897-4c56-b718-ebf061e5e15c" containerName="registry" containerID="cri-o://58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219" gracePeriod=30 Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.154141 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.154403 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.300290 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421562 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421620 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-registry-certificates\") pod \"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421701 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-registry-tls\") pod \"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421724 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-bound-sa-token\") pod \"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421796 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smt46\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-kube-api-access-smt46\") pod 
\"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421820 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23788c17-8897-4c56-b718-ebf061e5e15c-installation-pull-secrets\") pod \"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421893 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23788c17-8897-4c56-b718-ebf061e5e15c-ca-trust-extracted\") pod \"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.421913 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-trusted-ca\") pod \"23788c17-8897-4c56-b718-ebf061e5e15c\" (UID: \"23788c17-8897-4c56-b718-ebf061e5e15c\") " Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.423519 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.430054 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-kube-api-access-smt46" (OuterVolumeSpecName: "kube-api-access-smt46") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). 
InnerVolumeSpecName "kube-api-access-smt46". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.441380 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23788c17-8897-4c56-b718-ebf061e5e15c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.441975 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.443026 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23788c17-8897-4c56-b718-ebf061e5e15c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.443255 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.443664 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.459109 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "23788c17-8897-4c56-b718-ebf061e5e15c" (UID: "23788c17-8897-4c56-b718-ebf061e5e15c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.506527 4585 generic.go:334] "Generic (PLEG): container finished" podID="23788c17-8897-4c56-b718-ebf061e5e15c" containerID="58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219" exitCode=0 Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.506631 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" event={"ID":"23788c17-8897-4c56-b718-ebf061e5e15c","Type":"ContainerDied","Data":"58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219"} Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.506669 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" event={"ID":"23788c17-8897-4c56-b718-ebf061e5e15c","Type":"ContainerDied","Data":"d02fd00030cadd18c165fc74fd701cc2ab67d99f991e7cb88ea5d597f2a7b0d0"} Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.506691 4585 scope.go:117] "RemoveContainer" 
containerID="58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.506836 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-st5w4" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.513859 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spm6s" event={"ID":"ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe","Type":"ContainerStarted","Data":"5c9a08e745be4e5a9b7d862b5340b5a1db891aca5fbb4647faba531de6ce0524"} Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.525435 4585 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.525472 4585 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.525487 4585 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.525499 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smt46\" (UniqueName: \"kubernetes.io/projected/23788c17-8897-4c56-b718-ebf061e5e15c-kube-api-access-smt46\") on node \"crc\" DevicePath \"\"" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.525511 4585 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23788c17-8897-4c56-b718-ebf061e5e15c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 
15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.525521 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23788c17-8897-4c56-b718-ebf061e5e15c-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.525533 4585 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23788c17-8897-4c56-b718-ebf061e5e15c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.539712 4585 scope.go:117] "RemoveContainer" containerID="58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219" Feb 15 17:12:06 crc kubenswrapper[4585]: E0215 17:12:06.543068 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219\": container with ID starting with 58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219 not found: ID does not exist" containerID="58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.543163 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219"} err="failed to get container status \"58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219\": rpc error: code = NotFound desc = could not find container \"58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219\": container with ID starting with 58965a54ab7704d6bc88539e56639cc237c5040e704ec85c3338bb9cfcb69219 not found: ID does not exist" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.544566 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-spm6s" podStartSLOduration=3.120072806 
podStartE2EDuration="5.544539604s" podCreationTimestamp="2026-02-15 17:12:01 +0000 UTC" firstStartedPulling="2026-02-15 17:12:03.467948601 +0000 UTC m=+379.411356733" lastFinishedPulling="2026-02-15 17:12:05.892415399 +0000 UTC m=+381.835823531" observedRunningTime="2026-02-15 17:12:06.526136477 +0000 UTC m=+382.469544609" watchObservedRunningTime="2026-02-15 17:12:06.544539604 +0000 UTC m=+382.487947736" Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.551935 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-st5w4"] Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.557852 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-st5w4"] Feb 15 17:12:06 crc kubenswrapper[4585]: I0215 17:12:06.850901 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23788c17-8897-4c56-b718-ebf061e5e15c" path="/var/lib/kubelet/pods/23788c17-8897-4c56-b718-ebf061e5e15c/volumes" Feb 15 17:12:07 crc kubenswrapper[4585]: I0215 17:12:07.195131 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9h2zb" podUID="e8137661-759d-4fea-8a45-47e438c9fbe6" containerName="registry-server" probeResult="failure" output=< Feb 15 17:12:07 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:12:07 crc kubenswrapper[4585]: > Feb 15 17:12:07 crc kubenswrapper[4585]: I0215 17:12:07.366871 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:12:07 crc kubenswrapper[4585]: I0215 17:12:07.367012 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:12:08 crc kubenswrapper[4585]: I0215 17:12:08.408243 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xblvr" 
podUID="26ccbe5c-984a-4e61-9897-6357cdd14cab" containerName="registry-server" probeResult="failure" output=< Feb 15 17:12:08 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:12:08 crc kubenswrapper[4585]: > Feb 15 17:12:08 crc kubenswrapper[4585]: I0215 17:12:08.520158 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:12:08 crc kubenswrapper[4585]: I0215 17:12:08.520634 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:12:09 crc kubenswrapper[4585]: I0215 17:12:09.576645 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vcnr2" podUID="62d2ce7a-0eba-4d09-8c96-260ff647e3b2" containerName="registry-server" probeResult="failure" output=< Feb 15 17:12:09 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:12:09 crc kubenswrapper[4585]: > Feb 15 17:12:09 crc kubenswrapper[4585]: I0215 17:12:09.683933 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:12:09 crc kubenswrapper[4585]: I0215 17:12:09.683987 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:12:09 crc kubenswrapper[4585]: I0215 17:12:09.749383 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:12:10 crc kubenswrapper[4585]: I0215 17:12:10.615096 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vwfmx" Feb 15 17:12:10 crc kubenswrapper[4585]: I0215 17:12:10.896121 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-whfqg" 
Feb 15 17:12:10 crc kubenswrapper[4585]: I0215 17:12:10.896288 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:10 crc kubenswrapper[4585]: I0215 17:12:10.940571 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:11 crc kubenswrapper[4585]: I0215 17:12:11.617919 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-whfqg" Feb 15 17:12:12 crc kubenswrapper[4585]: I0215 17:12:12.095515 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:12 crc kubenswrapper[4585]: I0215 17:12:12.095634 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:12 crc kubenswrapper[4585]: I0215 17:12:12.156119 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:12 crc kubenswrapper[4585]: I0215 17:12:12.636707 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-spm6s" Feb 15 17:12:13 crc kubenswrapper[4585]: I0215 17:12:13.808931 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:12:13 crc kubenswrapper[4585]: I0215 17:12:13.876554 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xntlx" Feb 15 17:12:16 crc kubenswrapper[4585]: I0215 17:12:16.204121 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:12:16 crc kubenswrapper[4585]: I0215 17:12:16.262432 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9h2zb" Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.014890 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.014960 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.015014 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.016868 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a387d5dc4956e239f38a2c3f2fecb007c2e9827c50dc04627a92710ff93e588d"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.016947 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://a387d5dc4956e239f38a2c3f2fecb007c2e9827c50dc04627a92710ff93e588d" gracePeriod=600 Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.407986 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.473311 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xblvr" Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.618280 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="a387d5dc4956e239f38a2c3f2fecb007c2e9827c50dc04627a92710ff93e588d" exitCode=0 Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.618665 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"a387d5dc4956e239f38a2c3f2fecb007c2e9827c50dc04627a92710ff93e588d"} Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.618746 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"3be178554d50f4bf9fd8b4ef83fa0f64425cad682ded41b72f52fbeaa156e13e"} Feb 15 17:12:17 crc kubenswrapper[4585]: I0215 17:12:17.618771 4585 scope.go:117] "RemoveContainer" containerID="01583516b128dbe42c3eadf1758d2f6b7a55989afa08f232d2b0458522a31ab4" Feb 15 17:12:18 crc kubenswrapper[4585]: I0215 17:12:18.579752 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:12:18 crc kubenswrapper[4585]: I0215 17:12:18.641703 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vcnr2" Feb 15 17:14:17 crc kubenswrapper[4585]: I0215 17:14:17.014386 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:14:17 crc kubenswrapper[4585]: I0215 17:14:17.015057 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:14:45 crc kubenswrapper[4585]: I0215 17:14:45.216065 4585 scope.go:117] "RemoveContainer" containerID="b72af6a523a88d6dc2a537455e69d7a896b5491f2b1da4b72bc772dc97ccf591" Feb 15 17:14:47 crc kubenswrapper[4585]: I0215 17:14:47.014340 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:14:47 crc kubenswrapper[4585]: I0215 17:14:47.014887 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.412531 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg"] Feb 15 17:14:50 crc kubenswrapper[4585]: E0215 17:14:50.412887 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23788c17-8897-4c56-b718-ebf061e5e15c" containerName="registry" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.412899 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="23788c17-8897-4c56-b718-ebf061e5e15c" containerName="registry" Feb 15 
17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.413031 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="23788c17-8897-4c56-b718-ebf061e5e15c" containerName="registry" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.413463 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.417564 4585 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-fjmz8" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.417889 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.418061 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.419099 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-gvnsl"] Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.420006 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-gvnsl" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.424511 4585 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-x2prf" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.430049 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg"] Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.443643 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v8flx"] Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.444635 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.446382 4585 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-chrkc" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.453587 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-gvnsl"] Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.473445 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v8flx"] Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.500135 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rblwq\" (UniqueName: \"kubernetes.io/projected/260f1bf0-a58d-492f-9939-30c20b324a78-kube-api-access-rblwq\") pod \"cert-manager-858654f9db-gvnsl\" (UID: \"260f1bf0-a58d-492f-9939-30c20b324a78\") " pod="cert-manager/cert-manager-858654f9db-gvnsl" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.500192 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k22n4\" (UniqueName: \"kubernetes.io/projected/a53d4900-9905-4d34-acbf-4ef911683c2c-kube-api-access-k22n4\") pod \"cert-manager-cainjector-cf98fcc89-kqnpg\" (UID: \"a53d4900-9905-4d34-acbf-4ef911683c2c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.500264 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmqjm\" (UniqueName: \"kubernetes.io/projected/ee9f4317-b1fe-44a6-a4df-f14563acc190-kube-api-access-tmqjm\") pod \"cert-manager-webhook-687f57d79b-v8flx\" (UID: \"ee9f4317-b1fe-44a6-a4df-f14563acc190\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.601136 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rblwq\" (UniqueName: \"kubernetes.io/projected/260f1bf0-a58d-492f-9939-30c20b324a78-kube-api-access-rblwq\") pod \"cert-manager-858654f9db-gvnsl\" (UID: \"260f1bf0-a58d-492f-9939-30c20b324a78\") " pod="cert-manager/cert-manager-858654f9db-gvnsl" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.601199 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k22n4\" (UniqueName: \"kubernetes.io/projected/a53d4900-9905-4d34-acbf-4ef911683c2c-kube-api-access-k22n4\") pod \"cert-manager-cainjector-cf98fcc89-kqnpg\" (UID: \"a53d4900-9905-4d34-acbf-4ef911683c2c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.601241 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmqjm\" (UniqueName: \"kubernetes.io/projected/ee9f4317-b1fe-44a6-a4df-f14563acc190-kube-api-access-tmqjm\") pod \"cert-manager-webhook-687f57d79b-v8flx\" (UID: \"ee9f4317-b1fe-44a6-a4df-f14563acc190\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.618186 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rblwq\" (UniqueName: \"kubernetes.io/projected/260f1bf0-a58d-492f-9939-30c20b324a78-kube-api-access-rblwq\") pod \"cert-manager-858654f9db-gvnsl\" (UID: \"260f1bf0-a58d-492f-9939-30c20b324a78\") " pod="cert-manager/cert-manager-858654f9db-gvnsl" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.619188 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k22n4\" (UniqueName: \"kubernetes.io/projected/a53d4900-9905-4d34-acbf-4ef911683c2c-kube-api-access-k22n4\") pod \"cert-manager-cainjector-cf98fcc89-kqnpg\" (UID: \"a53d4900-9905-4d34-acbf-4ef911683c2c\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.621323 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmqjm\" (UniqueName: \"kubernetes.io/projected/ee9f4317-b1fe-44a6-a4df-f14563acc190-kube-api-access-tmqjm\") pod \"cert-manager-webhook-687f57d79b-v8flx\" (UID: \"ee9f4317-b1fe-44a6-a4df-f14563acc190\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.755464 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.770578 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-gvnsl" Feb 15 17:14:50 crc kubenswrapper[4585]: I0215 17:14:50.782216 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" Feb 15 17:14:51 crc kubenswrapper[4585]: I0215 17:14:51.024329 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg"] Feb 15 17:14:51 crc kubenswrapper[4585]: I0215 17:14:51.041101 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:14:51 crc kubenswrapper[4585]: W0215 17:14:51.068764 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee9f4317_b1fe_44a6_a4df_f14563acc190.slice/crio-8500bec92686199c154928b30e9619cb262de1e8cfa93d39266b60999a606eac WatchSource:0}: Error finding container 8500bec92686199c154928b30e9619cb262de1e8cfa93d39266b60999a606eac: Status 404 returned error can't find the container with id 8500bec92686199c154928b30e9619cb262de1e8cfa93d39266b60999a606eac Feb 15 17:14:51 crc kubenswrapper[4585]: I0215 
17:14:51.070840 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v8flx"] Feb 15 17:14:51 crc kubenswrapper[4585]: I0215 17:14:51.121491 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-gvnsl"] Feb 15 17:14:51 crc kubenswrapper[4585]: I0215 17:14:51.211061 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-gvnsl" event={"ID":"260f1bf0-a58d-492f-9939-30c20b324a78","Type":"ContainerStarted","Data":"a52e353d806a7471a4904f1fecfe794fdaae669c72f8d1860ccebc597d07ea16"} Feb 15 17:14:51 crc kubenswrapper[4585]: I0215 17:14:51.213538 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" event={"ID":"a53d4900-9905-4d34-acbf-4ef911683c2c","Type":"ContainerStarted","Data":"860b0a46be9563f7553254f23cf39b599a5d1a39aa30e405a176119e6921cdfc"} Feb 15 17:14:51 crc kubenswrapper[4585]: I0215 17:14:51.218714 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" event={"ID":"ee9f4317-b1fe-44a6-a4df-f14563acc190","Type":"ContainerStarted","Data":"8500bec92686199c154928b30e9619cb262de1e8cfa93d39266b60999a606eac"} Feb 15 17:14:55 crc kubenswrapper[4585]: I0215 17:14:55.253802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" event={"ID":"a53d4900-9905-4d34-acbf-4ef911683c2c","Type":"ContainerStarted","Data":"ce7e32a65c8a6db7763fc133a7e9b544073495781a7002cb88dc84ed822620c2"} Feb 15 17:14:55 crc kubenswrapper[4585]: I0215 17:14:55.255852 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" event={"ID":"ee9f4317-b1fe-44a6-a4df-f14563acc190","Type":"ContainerStarted","Data":"bb76192b6b0bd6c387081dfcf2b6a4b52947692e4012fc6778f6cbcad41b2269"} Feb 15 17:14:55 crc kubenswrapper[4585]: I0215 17:14:55.256479 4585 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" Feb 15 17:14:55 crc kubenswrapper[4585]: I0215 17:14:55.257773 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-gvnsl" event={"ID":"260f1bf0-a58d-492f-9939-30c20b324a78","Type":"ContainerStarted","Data":"8b9cdc892f6e0aed693912e0c7a8d6dcb8fdd0bd4acada249a275d4773fe2c5b"} Feb 15 17:14:55 crc kubenswrapper[4585]: I0215 17:14:55.272460 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kqnpg" podStartSLOduration=1.45690203 podStartE2EDuration="5.272443272s" podCreationTimestamp="2026-02-15 17:14:50 +0000 UTC" firstStartedPulling="2026-02-15 17:14:51.040876178 +0000 UTC m=+546.984284310" lastFinishedPulling="2026-02-15 17:14:54.85641742 +0000 UTC m=+550.799825552" observedRunningTime="2026-02-15 17:14:55.271958418 +0000 UTC m=+551.215366550" watchObservedRunningTime="2026-02-15 17:14:55.272443272 +0000 UTC m=+551.215851404" Feb 15 17:14:55 crc kubenswrapper[4585]: I0215 17:14:55.314155 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" podStartSLOduration=1.531677414 podStartE2EDuration="5.314139832s" podCreationTimestamp="2026-02-15 17:14:50 +0000 UTC" firstStartedPulling="2026-02-15 17:14:51.071137874 +0000 UTC m=+547.014546006" lastFinishedPulling="2026-02-15 17:14:54.853600262 +0000 UTC m=+550.797008424" observedRunningTime="2026-02-15 17:14:55.312043285 +0000 UTC m=+551.255451407" watchObservedRunningTime="2026-02-15 17:14:55.314139832 +0000 UTC m=+551.257547964" Feb 15 17:14:55 crc kubenswrapper[4585]: I0215 17:14:55.316337 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-gvnsl" podStartSLOduration=1.477975861 podStartE2EDuration="5.316329722s" podCreationTimestamp="2026-02-15 
17:14:50 +0000 UTC" firstStartedPulling="2026-02-15 17:14:51.121399951 +0000 UTC m=+547.064808083" lastFinishedPulling="2026-02-15 17:14:54.959753812 +0000 UTC m=+550.903161944" observedRunningTime="2026-02-15 17:14:55.292884296 +0000 UTC m=+551.236292428" watchObservedRunningTime="2026-02-15 17:14:55.316329722 +0000 UTC m=+551.259737854" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.225705 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"] Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.228814 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.240361 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"] Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.242222 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.242997 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.385343 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11020ade-99c4-480e-bb24-5e0051a2b870-secret-volume\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.385412 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/11020ade-99c4-480e-bb24-5e0051a2b870-config-volume\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.385629 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbp9t\" (UniqueName: \"kubernetes.io/projected/11020ade-99c4-480e-bb24-5e0051a2b870-kube-api-access-nbp9t\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.487633 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11020ade-99c4-480e-bb24-5e0051a2b870-config-volume\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.487747 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbp9t\" (UniqueName: \"kubernetes.io/projected/11020ade-99c4-480e-bb24-5e0051a2b870-kube-api-access-nbp9t\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.487905 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11020ade-99c4-480e-bb24-5e0051a2b870-secret-volume\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc 
kubenswrapper[4585]: I0215 17:15:00.489859 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11020ade-99c4-480e-bb24-5e0051a2b870-config-volume\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.503477 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11020ade-99c4-480e-bb24-5e0051a2b870-secret-volume\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.522046 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbp9t\" (UniqueName: \"kubernetes.io/projected/11020ade-99c4-480e-bb24-5e0051a2b870-kube-api-access-nbp9t\") pod \"collect-profiles-29519595-b2ljf\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.560145 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.625291 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vp6tl"] Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.625918 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="nbdb" containerID="cri-o://f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.626388 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="sbdb" containerID="cri-o://0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.626473 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.626522 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="northd" containerID="cri-o://7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.626574 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-node" 
containerID="cri-o://70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.628322 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-acl-logging" containerID="cri-o://de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.633126 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-controller" containerID="cri-o://c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.693862 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" containerID="cri-o://8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" gracePeriod=30 Feb 15 17:15:00 crc kubenswrapper[4585]: E0215 17:15:00.739936 4585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c2bec047a60dcfcbf71f73a80058f9e36f15c89092cfcfae8cb1e3cc4335db38): error adding pod openshift-operator-lifecycle-manager_collect-profiles-29519595-b2ljf to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: 
{\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" Feb 15 17:15:00 crc kubenswrapper[4585]: E0215 17:15:00.740005 4585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c2bec047a60dcfcbf71f73a80058f9e36f15c89092cfcfae8cb1e3cc4335db38): error adding pod openshift-operator-lifecycle-manager_collect-profiles-29519595-b2ljf to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: E0215 17:15:00.740027 4585 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c2bec047a60dcfcbf71f73a80058f9e36f15c89092cfcfae8cb1e3cc4335db38): error adding pod 
openshift-operator-lifecycle-manager_collect-profiles-29519595-b2ljf to CNI network \"multus-cni-network\": plugin type=\"multus-shim\" name=\"multus-cni-network\" failed (add): CmdAdd (shim): failed to send CNI request: Post \"http://dummy/cni\": EOF: StdinData: {\"binDir\":\"/var/lib/cni/bin\",\"clusterNetwork\":\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\",\"cniVersion\":\"0.3.1\",\"daemonSocketDir\":\"/run/multus/socket\",\"globalNamespaces\":\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\",\"logLevel\":\"verbose\",\"logToStderr\":true,\"name\":\"multus-cni-network\",\"namespaceIsolation\":true,\"type\":\"multus-shim\"}" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:00 crc kubenswrapper[4585]: E0215 17:15:00.740074 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c2bec047a60dcfcbf71f73a80058f9e36f15c89092cfcfae8cb1e3cc4335db38): error adding pod openshift-operator-lifecycle-manager_collect-profiles-29519595-b2ljf to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): failed to send CNI request: Post \\\"http://dummy/cni\\\": EOF: StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.787289 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-v8flx" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.991183 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/3.log" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.994432 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovn-acl-logging/0.log" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.995259 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovn-controller/0.log" Feb 15 17:15:00 crc kubenswrapper[4585]: I0215 17:15:00.995818 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.057959 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2hkkw"] Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058251 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-node" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058263 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-node" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058289 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058297 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058310 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058320 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058328 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058336 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058343 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058352 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058361 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-ovn-metrics" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058368 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-ovn-metrics" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058381 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="nbdb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058388 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="nbdb" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058416 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="northd" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058424 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="northd" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058438 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058445 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058460 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kubecfg-setup" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058467 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kubecfg-setup" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058474 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-acl-logging" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058480 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-acl-logging" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058491 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="sbdb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058497 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="sbdb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058644 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-node" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058658 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058666 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="kube-rbac-proxy-ovn-metrics" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058672 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058682 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="nbdb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058691 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058697 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058703 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="northd" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058712 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058720 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovn-acl-logging" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058731 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="sbdb" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.058873 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.058880 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.059002 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerName="ovnkube-controller" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.061143 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196506 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-ovn\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196584 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-config\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196629 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-netd\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196665 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196692 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196802 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-kubelet\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196862 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-script-lib\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196927 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-openvswitch\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.196967 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197001 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-env-overrides\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197042 4585 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-node-log\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197076 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197105 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-var-lib-openvswitch\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197119 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197135 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-systemd-units\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197153 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197167 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-bin\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197200 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-node-log" (OuterVolumeSpecName: "node-log") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197206 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5acdc04-0978-4907-bd9e-965400ded9bf-ovn-node-metrics-cert\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197226 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197235 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-slash\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197267 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-systemd\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197298 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-etc-openvswitch\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 
17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197337 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-netns\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197389 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-ovn-kubernetes\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197420 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-log-socket\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197455 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h74xg\" (UniqueName: \"kubernetes.io/projected/e5acdc04-0978-4907-bd9e-965400ded9bf-kube-api-access-h74xg\") pod \"e5acdc04-0978-4907-bd9e-965400ded9bf\" (UID: \"e5acdc04-0978-4907-bd9e-965400ded9bf\") " Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197503 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197542 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-slash" (OuterVolumeSpecName: "host-slash") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197564 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197593 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197647 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197681 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197752 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-systemd-units\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197771 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197798 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-systemd\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197820 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-log-socket" (OuterVolumeSpecName: "log-socket") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197827 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-ovn\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197848 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197867 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197876 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197929 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-node-log\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.197982 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovn-node-metrics-cert\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198088 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-etc-openvswitch\") pod 
\"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198149 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovnkube-config\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198245 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198321 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-run-netns\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198368 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-var-lib-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198439 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-cni-bin\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198467 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovnkube-script-lib\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198497 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-env-overrides\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198552 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198659 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-cni-netd\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198710 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-log-socket\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198760 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-kubelet\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198811 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-slash\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198843 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqb7z\" (UniqueName: \"kubernetes.io/projected/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-kube-api-access-fqb7z\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198908 4585 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198927 4585 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198946 4585 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198965 4585 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-slash\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198981 4585 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.198997 4585 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199015 4585 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199032 4585 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-log-socket\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199049 4585 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc 
kubenswrapper[4585]: I0215 17:15:01.199067 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199083 4585 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199100 4585 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199116 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199132 4585 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199150 4585 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199168 4585 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5acdc04-0978-4907-bd9e-965400ded9bf-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.199185 4585 
reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-node-log\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.202049 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5acdc04-0978-4907-bd9e-965400ded9bf-kube-api-access-h74xg" (OuterVolumeSpecName: "kube-api-access-h74xg") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "kube-api-access-h74xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.202313 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5acdc04-0978-4907-bd9e-965400ded9bf-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.225308 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e5acdc04-0978-4907-bd9e-965400ded9bf" (UID: "e5acdc04-0978-4907-bd9e-965400ded9bf"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300359 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-kubelet\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300399 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-slash\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300419 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqb7z\" (UniqueName: \"kubernetes.io/projected/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-kube-api-access-fqb7z\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300437 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-systemd-units\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300450 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-systemd\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc 
kubenswrapper[4585]: I0215 17:15:01.300466 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-ovn\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300486 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300500 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-node-log\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300521 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovn-node-metrics-cert\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300540 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-etc-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300568 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovnkube-config\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300620 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300646 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-run-netns\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300662 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-var-lib-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300689 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-cni-bin\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300715 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovnkube-script-lib\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300736 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-env-overrides\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300755 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300773 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-cni-netd\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300796 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-log-socket\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300845 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/e5acdc04-0978-4907-bd9e-965400ded9bf-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300859 4585 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5acdc04-0978-4907-bd9e-965400ded9bf-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300872 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h74xg\" (UniqueName: \"kubernetes.io/projected/e5acdc04-0978-4907-bd9e-965400ded9bf-kube-api-access-h74xg\") on node \"crc\" DevicePath \"\"" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300918 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-log-socket\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300958 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-kubelet\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.300987 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-slash\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.301334 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-systemd-units\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.302997 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-systemd\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303039 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-ovn\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303086 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-run-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303105 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-node-log\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303499 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-var-lib-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: 
\"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303533 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-etc-openvswitch\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303563 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-run-netns\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303613 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303591 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303651 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-cni-netd\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.303681 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-host-cni-bin\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.304066 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-env-overrides\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.304652 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovnkube-config\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.304763 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovnkube-script-lib\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.311823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-ovn-node-metrics-cert\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 
17:15:01.318449 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqb7z\" (UniqueName: \"kubernetes.io/projected/e43648d5-c7db-40e4-95d9-c9eab71bf9a9-kube-api-access-fqb7z\") pod \"ovnkube-node-2hkkw\" (UID: \"e43648d5-c7db-40e4-95d9-c9eab71bf9a9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.325247 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovnkube-controller/3.log" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.327960 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovn-acl-logging/0.log" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.328672 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vp6tl_e5acdc04-0978-4907-bd9e-965400ded9bf/ovn-controller/0.log" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329297 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" exitCode=0 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329332 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" exitCode=0 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329347 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" exitCode=0 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329360 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" 
containerID="7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" exitCode=0 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329369 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329392 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329438 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329457 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329373 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" exitCode=0 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329495 4585 scope.go:117] "RemoveContainer" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329509 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" exitCode=0 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329539 4585 
generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" exitCode=143 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329555 4585 generic.go:334] "Generic (PLEG): container finished" podID="e5acdc04-0978-4907-bd9e-965400ded9bf" containerID="c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" exitCode=143 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329479 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329683 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329711 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329733 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329751 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329763 4585 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329774 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329785 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329798 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329809 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329820 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329830 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329845 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} Feb 15 17:15:01 crc 
kubenswrapper[4585]: I0215 17:15:01.329862 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329875 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329886 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329898 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329908 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329919 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329931 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329942 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} Feb 15 17:15:01 crc 
kubenswrapper[4585]: I0215 17:15:01.329952 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329963 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329977 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.329992 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330005 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330016 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330026 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330038 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330048 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330058 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330069 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330079 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330089 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330104 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp6tl" event={"ID":"e5acdc04-0978-4907-bd9e-965400ded9bf","Type":"ContainerDied","Data":"d7ea8bc1b3f2011951fbe647194ae6634724839334ed4aa0d1e0449bf596d5bd"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330119 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330131 4585 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330142 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330154 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330165 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330176 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330187 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330198 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330209 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.330220 4585 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.335189 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/2.log" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.335879 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/1.log" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.335929 4585 generic.go:334] "Generic (PLEG): container finished" podID="70645395-8d49-4495-a647-b6d43206ecbc" containerID="14196f60e816bd7337ce1fe272a79514ddd5bfacb4a1106cdf6530c16feaf6ed" exitCode=2 Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.336001 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.336679 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.337019 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerDied","Data":"14196f60e816bd7337ce1fe272a79514ddd5bfacb4a1106cdf6530c16feaf6ed"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.337050 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8"} Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.337486 4585 scope.go:117] "RemoveContainer" containerID="14196f60e816bd7337ce1fe272a79514ddd5bfacb4a1106cdf6530c16feaf6ed" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.337807 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n4ps2_openshift-multus(70645395-8d49-4495-a647-b6d43206ecbc)\"" pod="openshift-multus/multus-n4ps2" podUID="70645395-8d49-4495-a647-b6d43206ecbc" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.376185 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.377761 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.400980 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vp6tl"] Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.405044 4585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(e3183db07f48f8d5a06f7af5c56e7b311fd553f6962c212528c0de8d33b4b9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.405102 4585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(e3183db07f48f8d5a06f7af5c56e7b311fd553f6962c212528c0de8d33b4b9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.405125 4585 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(e3183db07f48f8d5a06f7af5c56e7b311fd553f6962c212528c0de8d33b4b9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.405174 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(e3183db07f48f8d5a06f7af5c56e7b311fd553f6962c212528c0de8d33b4b9d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.414286 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vp6tl"] Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.425152 4585 scope.go:117] "RemoveContainer" containerID="0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.438804 4585 scope.go:117] "RemoveContainer" containerID="f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.455568 4585 scope.go:117] "RemoveContainer" containerID="7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.474181 4585 scope.go:117] "RemoveContainer" containerID="af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.489032 4585 scope.go:117] "RemoveContainer" 
containerID="70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.513615 4585 scope.go:117] "RemoveContainer" containerID="de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.578483 4585 scope.go:117] "RemoveContainer" containerID="c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.605624 4585 scope.go:117] "RemoveContainer" containerID="911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.629373 4585 scope.go:117] "RemoveContainer" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.630181 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": container with ID starting with 8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81 not found: ID does not exist" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.630227 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} err="failed to get container status \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": rpc error: code = NotFound desc = could not find container \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": container with ID starting with 8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.630257 4585 scope.go:117] "RemoveContainer" 
containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.630833 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": container with ID starting with f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164 not found: ID does not exist" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.630865 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} err="failed to get container status \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": rpc error: code = NotFound desc = could not find container \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": container with ID starting with f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.630882 4585 scope.go:117] "RemoveContainer" containerID="0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.631397 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": container with ID starting with 0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364 not found: ID does not exist" containerID="0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.631413 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} err="failed to get container status \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": rpc error: code = NotFound desc = could not find container \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": container with ID starting with 0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.631425 4585 scope.go:117] "RemoveContainer" containerID="f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.632094 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": container with ID starting with f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343 not found: ID does not exist" containerID="f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.632122 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} err="failed to get container status \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": rpc error: code = NotFound desc = could not find container \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": container with ID starting with f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.632138 4585 scope.go:117] "RemoveContainer" containerID="7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.633138 4585 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": container with ID starting with 7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf not found: ID does not exist" containerID="7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.633163 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} err="failed to get container status \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": rpc error: code = NotFound desc = could not find container \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": container with ID starting with 7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.633175 4585 scope.go:117] "RemoveContainer" containerID="af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.633557 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": container with ID starting with af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb not found: ID does not exist" containerID="af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.633585 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} err="failed to get container status \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": rpc error: code = NotFound desc = could not find container 
\"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": container with ID starting with af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.633618 4585 scope.go:117] "RemoveContainer" containerID="70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.633977 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": container with ID starting with 70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3 not found: ID does not exist" containerID="70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.634015 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} err="failed to get container status \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": rpc error: code = NotFound desc = could not find container \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": container with ID starting with 70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.634033 4585 scope.go:117] "RemoveContainer" containerID="de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.634363 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": container with ID starting with de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd not found: ID does not exist" 
containerID="de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.634428 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} err="failed to get container status \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": rpc error: code = NotFound desc = could not find container \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": container with ID starting with de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.634460 4585 scope.go:117] "RemoveContainer" containerID="c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.634952 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": container with ID starting with c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d not found: ID does not exist" containerID="c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.635011 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} err="failed to get container status \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": rpc error: code = NotFound desc = could not find container \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": container with ID starting with c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.635051 4585 scope.go:117] 
"RemoveContainer" containerID="911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe" Feb 15 17:15:01 crc kubenswrapper[4585]: E0215 17:15:01.635465 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": container with ID starting with 911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe not found: ID does not exist" containerID="911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.635504 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} err="failed to get container status \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": rpc error: code = NotFound desc = could not find container \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": container with ID starting with 911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.635527 4585 scope.go:117] "RemoveContainer" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.635917 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} err="failed to get container status \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": rpc error: code = NotFound desc = could not find container \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": container with ID starting with 8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.635949 4585 
scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.636389 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} err="failed to get container status \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": rpc error: code = NotFound desc = could not find container \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": container with ID starting with f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.636421 4585 scope.go:117] "RemoveContainer" containerID="0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.636753 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} err="failed to get container status \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": rpc error: code = NotFound desc = could not find container \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": container with ID starting with 0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.636778 4585 scope.go:117] "RemoveContainer" containerID="f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.637067 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} err="failed to get container status \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": rpc 
error: code = NotFound desc = could not find container \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": container with ID starting with f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.637105 4585 scope.go:117] "RemoveContainer" containerID="7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.637397 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} err="failed to get container status \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": rpc error: code = NotFound desc = could not find container \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": container with ID starting with 7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.637451 4585 scope.go:117] "RemoveContainer" containerID="af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.637791 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} err="failed to get container status \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": rpc error: code = NotFound desc = could not find container \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": container with ID starting with af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.637815 4585 scope.go:117] "RemoveContainer" containerID="70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" Feb 15 17:15:01 crc 
kubenswrapper[4585]: I0215 17:15:01.638059 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} err="failed to get container status \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": rpc error: code = NotFound desc = could not find container \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": container with ID starting with 70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.638092 4585 scope.go:117] "RemoveContainer" containerID="de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.638443 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} err="failed to get container status \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": rpc error: code = NotFound desc = could not find container \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": container with ID starting with de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.638471 4585 scope.go:117] "RemoveContainer" containerID="c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.638823 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} err="failed to get container status \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": rpc error: code = NotFound desc = could not find container \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": container 
with ID starting with c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.638850 4585 scope.go:117] "RemoveContainer" containerID="911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.639103 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} err="failed to get container status \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": rpc error: code = NotFound desc = could not find container \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": container with ID starting with 911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.639135 4585 scope.go:117] "RemoveContainer" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.639431 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} err="failed to get container status \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": rpc error: code = NotFound desc = could not find container \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": container with ID starting with 8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.639491 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.639790 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} err="failed to get container status \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": rpc error: code = NotFound desc = could not find container \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": container with ID starting with f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.639815 4585 scope.go:117] "RemoveContainer" containerID="0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640054 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} err="failed to get container status \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": rpc error: code = NotFound desc = could not find container \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": container with ID starting with 0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640088 4585 scope.go:117] "RemoveContainer" containerID="f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640376 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} err="failed to get container status \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": rpc error: code = NotFound desc = could not find container \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": container with ID starting with f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343 not found: ID does not 
exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640395 4585 scope.go:117] "RemoveContainer" containerID="7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640655 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} err="failed to get container status \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": rpc error: code = NotFound desc = could not find container \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": container with ID starting with 7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640679 4585 scope.go:117] "RemoveContainer" containerID="af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640952 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} err="failed to get container status \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": rpc error: code = NotFound desc = could not find container \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": container with ID starting with af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.640976 4585 scope.go:117] "RemoveContainer" containerID="70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.641247 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} err="failed to get container status 
\"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": rpc error: code = NotFound desc = could not find container \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": container with ID starting with 70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.641281 4585 scope.go:117] "RemoveContainer" containerID="de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.641512 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} err="failed to get container status \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": rpc error: code = NotFound desc = could not find container \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": container with ID starting with de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.641531 4585 scope.go:117] "RemoveContainer" containerID="c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.642841 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} err="failed to get container status \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": rpc error: code = NotFound desc = could not find container \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": container with ID starting with c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.642862 4585 scope.go:117] "RemoveContainer" 
containerID="911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643099 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} err="failed to get container status \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": rpc error: code = NotFound desc = could not find container \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": container with ID starting with 911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643126 4585 scope.go:117] "RemoveContainer" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643395 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} err="failed to get container status \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": rpc error: code = NotFound desc = could not find container \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": container with ID starting with 8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643418 4585 scope.go:117] "RemoveContainer" containerID="f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643713 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164"} err="failed to get container status \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": rpc error: code = NotFound desc = could 
not find container \"f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164\": container with ID starting with f03e24b3d2575ff32cb184d7133a8bda5eb280083ed290284293b8d870d73164 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643738 4585 scope.go:117] "RemoveContainer" containerID="0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643940 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364"} err="failed to get container status \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": rpc error: code = NotFound desc = could not find container \"0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364\": container with ID starting with 0abd735738863ada26a5e116ba22d86d9f62c6e591ef1172463395877386c364 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.643970 4585 scope.go:117] "RemoveContainer" containerID="f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.644246 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343"} err="failed to get container status \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": rpc error: code = NotFound desc = could not find container \"f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343\": container with ID starting with f633d7b78b7835a39425d7d34c9623fc5ebf4b33ffbe8571088b2640fe88a343 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.644268 4585 scope.go:117] "RemoveContainer" containerID="7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 
17:15:01.644549 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf"} err="failed to get container status \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": rpc error: code = NotFound desc = could not find container \"7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf\": container with ID starting with 7d65c583aa3ddf5bdd293d1c776f6be924ee18af8f5e00ba37e057e93f293eaf not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.644575 4585 scope.go:117] "RemoveContainer" containerID="af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.644891 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb"} err="failed to get container status \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": rpc error: code = NotFound desc = could not find container \"af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb\": container with ID starting with af56cc3debd2c1499568f74f080e8d14941e77ad35f9f8cd62b9dfd999c25dfb not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.644926 4585 scope.go:117] "RemoveContainer" containerID="70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.645312 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3"} err="failed to get container status \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": rpc error: code = NotFound desc = could not find container \"70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3\": container with ID starting with 
70a040aebafbf3f5b269e95cfdef817dded58bb1aeefcf1418b35694b87505f3 not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.645337 4585 scope.go:117] "RemoveContainer" containerID="de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.645546 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd"} err="failed to get container status \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": rpc error: code = NotFound desc = could not find container \"de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd\": container with ID starting with de321a689cba25e85e309427e03bf259f77a340dca15fffe6594134f31e4adbd not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.645568 4585 scope.go:117] "RemoveContainer" containerID="c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.645808 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d"} err="failed to get container status \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": rpc error: code = NotFound desc = could not find container \"c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d\": container with ID starting with c7850d8f75b39f735d9119da08917c1436da4f2d78fddaf7d60e4e188b7fa42d not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.645832 4585 scope.go:117] "RemoveContainer" containerID="911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.646087 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe"} err="failed to get container status \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": rpc error: code = NotFound desc = could not find container \"911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe\": container with ID starting with 911e7b7ef5607be04d8a298f1febf2460a9fc1bf285dc2f79b04bbb5109d0ebe not found: ID does not exist" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.646125 4585 scope.go:117] "RemoveContainer" containerID="8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81" Feb 15 17:15:01 crc kubenswrapper[4585]: I0215 17:15:01.646442 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81"} err="failed to get container status \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": rpc error: code = NotFound desc = could not find container \"8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81\": container with ID starting with 8944f31f43f511e08363a8017da34892e94e81b4b43f6bf2f5ef2ab9ebdb4b81 not found: ID does not exist" Feb 15 17:15:02 crc kubenswrapper[4585]: I0215 17:15:02.348716 4585 generic.go:334] "Generic (PLEG): container finished" podID="e43648d5-c7db-40e4-95d9-c9eab71bf9a9" containerID="9772b75b042753fd7feca8829696bcbc710a86a65322f0f5f219851ebe9b2e78" exitCode=0 Feb 15 17:15:02 crc kubenswrapper[4585]: I0215 17:15:02.348768 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerDied","Data":"9772b75b042753fd7feca8829696bcbc710a86a65322f0f5f219851ebe9b2e78"} Feb 15 17:15:02 crc kubenswrapper[4585]: I0215 17:15:02.348800 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" 
event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"3ddc00e513504942a4d759317adfe3deab4ac1f127131155cd863ebd9b694aa0"} Feb 15 17:15:02 crc kubenswrapper[4585]: I0215 17:15:02.853404 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5acdc04-0978-4907-bd9e-965400ded9bf" path="/var/lib/kubelet/pods/e5acdc04-0978-4907-bd9e-965400ded9bf/volumes" Feb 15 17:15:03 crc kubenswrapper[4585]: I0215 17:15:03.363399 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"395fd074a1046c06f6ee0a2a95da58227680afc17a7e163e784458f634484ae9"} Feb 15 17:15:03 crc kubenswrapper[4585]: I0215 17:15:03.363763 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"1c198373a6e884b6cf41305cf95fe2bddbde81c716e4b7acc255fc1ceccc79ed"} Feb 15 17:15:03 crc kubenswrapper[4585]: I0215 17:15:03.363786 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"a01134527496c35ff979f76f443256de8b27b5519ef9bd99ee1d91a44923a9f3"} Feb 15 17:15:03 crc kubenswrapper[4585]: I0215 17:15:03.363809 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"40bee5033e77c0f8e982818bb91bea9d509c19895e8d2cf0b65a9213708bc0f4"} Feb 15 17:15:03 crc kubenswrapper[4585]: I0215 17:15:03.363826 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" 
event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"bc58f0512e7df1d40697c049d76dca7aff96e3f891219064f0a7dfc05efa78b5"} Feb 15 17:15:03 crc kubenswrapper[4585]: I0215 17:15:03.363843 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"13b5a209651485295419afe919ddc39440861a0b64c74b2d5845e6dafcc2e7ca"} Feb 15 17:15:06 crc kubenswrapper[4585]: I0215 17:15:06.396865 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"e1657a3e58239bf9c93e350e51a5577a9ec7ae1209ec118ac9e9dde13a2df32d"} Feb 15 17:15:08 crc kubenswrapper[4585]: I0215 17:15:08.417410 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" event={"ID":"e43648d5-c7db-40e4-95d9-c9eab71bf9a9","Type":"ContainerStarted","Data":"b175e1a6cc1a1029abd389bbf65516a7892de3cc0d149e0b6807c9701317ea59"} Feb 15 17:15:08 crc kubenswrapper[4585]: I0215 17:15:08.418520 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:08 crc kubenswrapper[4585]: I0215 17:15:08.418538 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:08 crc kubenswrapper[4585]: I0215 17:15:08.418547 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:08 crc kubenswrapper[4585]: I0215 17:15:08.446730 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" podStartSLOduration=7.446716295 podStartE2EDuration="7.446716295s" podCreationTimestamp="2026-02-15 17:15:01 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:15:08.44614494 +0000 UTC m=+564.389553092" watchObservedRunningTime="2026-02-15 17:15:08.446716295 +0000 UTC m=+564.390124427" Feb 15 17:15:08 crc kubenswrapper[4585]: I0215 17:15:08.455833 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:08 crc kubenswrapper[4585]: I0215 17:15:08.467772 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:15 crc kubenswrapper[4585]: I0215 17:15:15.841787 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:15 crc kubenswrapper[4585]: I0215 17:15:15.843197 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:15 crc kubenswrapper[4585]: E0215 17:15:15.938682 4585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c38258c0269aaa0ec2b799792120823e3f80e370cc9141fbacd39b8d61c70136): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 15 17:15:15 crc kubenswrapper[4585]: E0215 17:15:15.938760 4585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c38258c0269aaa0ec2b799792120823e3f80e370cc9141fbacd39b8d61c70136): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:15 crc kubenswrapper[4585]: E0215 17:15:15.938785 4585 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c38258c0269aaa0ec2b799792120823e3f80e370cc9141fbacd39b8d61c70136): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:15 crc kubenswrapper[4585]: E0215 17:15:15.938836 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c38258c0269aaa0ec2b799792120823e3f80e370cc9141fbacd39b8d61c70136): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" Feb 15 17:15:16 crc kubenswrapper[4585]: I0215 17:15:16.841697 4585 scope.go:117] "RemoveContainer" containerID="14196f60e816bd7337ce1fe272a79514ddd5bfacb4a1106cdf6530c16feaf6ed" Feb 15 17:15:16 crc kubenswrapper[4585]: E0215 17:15:16.842046 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n4ps2_openshift-multus(70645395-8d49-4495-a647-b6d43206ecbc)\"" pod="openshift-multus/multus-n4ps2" podUID="70645395-8d49-4495-a647-b6d43206ecbc" Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.014026 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.014110 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.014174 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.015155 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3be178554d50f4bf9fd8b4ef83fa0f64425cad682ded41b72f52fbeaa156e13e"} 
pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.015253 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://3be178554d50f4bf9fd8b4ef83fa0f64425cad682ded41b72f52fbeaa156e13e" gracePeriod=600 Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.900665 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="3be178554d50f4bf9fd8b4ef83fa0f64425cad682ded41b72f52fbeaa156e13e" exitCode=0 Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.901020 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"3be178554d50f4bf9fd8b4ef83fa0f64425cad682ded41b72f52fbeaa156e13e"} Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.901061 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"66bd3998ff2493b6c4431c56b818df1c025a69a1e07091de641e0ebe4853beee"} Feb 15 17:15:17 crc kubenswrapper[4585]: I0215 17:15:17.901145 4585 scope.go:117] "RemoveContainer" containerID="a387d5dc4956e239f38a2c3f2fecb007c2e9827c50dc04627a92710ff93e588d" Feb 15 17:15:29 crc kubenswrapper[4585]: I0215 17:15:29.841142 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:29 crc kubenswrapper[4585]: I0215 17:15:29.843061 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:29 crc kubenswrapper[4585]: E0215 17:15:29.902145 4585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c59fec70514afa181a483572de430faa5b1119886c65f01f3fbbebe86b1681f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 15 17:15:29 crc kubenswrapper[4585]: E0215 17:15:29.902254 4585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c59fec70514afa181a483572de430faa5b1119886c65f01f3fbbebe86b1681f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:29 crc kubenswrapper[4585]: E0215 17:15:29.902321 4585 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c59fec70514afa181a483572de430faa5b1119886c65f01f3fbbebe86b1681f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" Feb 15 17:15:29 crc kubenswrapper[4585]: E0215 17:15:29.902395 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager(11020ade-99c4-480e-bb24-5e0051a2b870)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29519595-b2ljf_openshift-operator-lifecycle-manager_11020ade-99c4-480e-bb24-5e0051a2b870_0(c59fec70514afa181a483572de430faa5b1119886c65f01f3fbbebe86b1681f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" Feb 15 17:15:31 crc kubenswrapper[4585]: I0215 17:15:31.427405 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2hkkw" Feb 15 17:15:31 crc kubenswrapper[4585]: I0215 17:15:31.842520 4585 scope.go:117] "RemoveContainer" containerID="14196f60e816bd7337ce1fe272a79514ddd5bfacb4a1106cdf6530c16feaf6ed" Feb 15 17:15:33 crc kubenswrapper[4585]: I0215 17:15:33.051497 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/2.log" Feb 15 17:15:33 crc kubenswrapper[4585]: I0215 17:15:33.052781 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/1.log" Feb 15 17:15:33 crc kubenswrapper[4585]: I0215 17:15:33.052869 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4ps2" 
event={"ID":"70645395-8d49-4495-a647-b6d43206ecbc","Type":"ContainerStarted","Data":"aeb549212a07f44b6538746e9edf62c295a75f561a9d0a52b417ae43e6b34408"} Feb 15 17:15:38 crc kubenswrapper[4585]: I0215 17:15:38.905307 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk"] Feb 15 17:15:38 crc kubenswrapper[4585]: I0215 17:15:38.907914 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:38 crc kubenswrapper[4585]: W0215 17:15:38.911866 4585 reflector.go:561] object-"openshift-marketplace"/"default-dockercfg-vmwhc": failed to list *v1.Secret: secrets "default-dockercfg-vmwhc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Feb 15 17:15:38 crc kubenswrapper[4585]: E0215 17:15:38.911904 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"default-dockercfg-vmwhc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-vmwhc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 15 17:15:38 crc kubenswrapper[4585]: I0215 17:15:38.931445 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk"] Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.093072 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" 
(UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.093139 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9gv\" (UniqueName: \"kubernetes.io/projected/ecd88680-0e48-4c4f-9568-9f63dcd3e127-kube-api-access-gn9gv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.093201 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.195060 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.195269 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.195328 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9gv\" (UniqueName: \"kubernetes.io/projected/ecd88680-0e48-4c4f-9568-9f63dcd3e127-kube-api-access-gn9gv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.195910 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.196705 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.221249 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9gv\" (UniqueName: \"kubernetes.io/projected/ecd88680-0e48-4c4f-9568-9f63dcd3e127-kube-api-access-gn9gv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 
17:15:39.810233 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 15 17:15:39 crc kubenswrapper[4585]: I0215 17:15:39.814263 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk"
Feb 15 17:15:40 crc kubenswrapper[4585]: I0215 17:15:40.297348 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk"]
Feb 15 17:15:41 crc kubenswrapper[4585]: I0215 17:15:41.136328 4585 generic.go:334] "Generic (PLEG): container finished" podID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerID="1350a7abc320cd9b16317ce229eacec50827a3cf79757ff17314d75c4344dd5e" exitCode=0
Feb 15 17:15:41 crc kubenswrapper[4585]: I0215 17:15:41.136487 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" event={"ID":"ecd88680-0e48-4c4f-9568-9f63dcd3e127","Type":"ContainerDied","Data":"1350a7abc320cd9b16317ce229eacec50827a3cf79757ff17314d75c4344dd5e"}
Feb 15 17:15:41 crc kubenswrapper[4585]: I0215 17:15:41.137326 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" event={"ID":"ecd88680-0e48-4c4f-9568-9f63dcd3e127","Type":"ContainerStarted","Data":"bee114d5b6950d0f243d913ff4f1ff984273b550f981328b0772ba95b143e3a5"}
Feb 15 17:15:43 crc kubenswrapper[4585]: I0215 17:15:43.161557 4585 generic.go:334] "Generic (PLEG): container finished" podID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerID="438754d7cbcc92dc91a1935d76dfd58c3dca1531212963c83b3be89763b242fb" exitCode=0
Feb 15 17:15:43 crc kubenswrapper[4585]: I0215 17:15:43.161688 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" event={"ID":"ecd88680-0e48-4c4f-9568-9f63dcd3e127","Type":"ContainerDied","Data":"438754d7cbcc92dc91a1935d76dfd58c3dca1531212963c83b3be89763b242fb"}
Feb 15 17:15:43 crc kubenswrapper[4585]: I0215 17:15:43.840897 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"
Feb 15 17:15:43 crc kubenswrapper[4585]: I0215 17:15:43.852535 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"
Feb 15 17:15:44 crc kubenswrapper[4585]: I0215 17:15:44.172781 4585 generic.go:334] "Generic (PLEG): container finished" podID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerID="5db74549d80ebee5ea5abdf80696b0ea84a05cfe0a1a8348d94aefc5f5773b9b" exitCode=0
Feb 15 17:15:44 crc kubenswrapper[4585]: I0215 17:15:44.172872 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" event={"ID":"ecd88680-0e48-4c4f-9568-9f63dcd3e127","Type":"ContainerDied","Data":"5db74549d80ebee5ea5abdf80696b0ea84a05cfe0a1a8348d94aefc5f5773b9b"}
Feb 15 17:15:44 crc kubenswrapper[4585]: I0215 17:15:44.343591 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"]
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.189150 4585 generic.go:334] "Generic (PLEG): container finished" podID="11020ade-99c4-480e-bb24-5e0051a2b870" containerID="37de8d54a840b44884ea91565c0ba9da0456ddade5f48def5a0215d0fa231949" exitCode=0
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.189228 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" event={"ID":"11020ade-99c4-480e-bb24-5e0051a2b870","Type":"ContainerDied","Data":"37de8d54a840b44884ea91565c0ba9da0456ddade5f48def5a0215d0fa231949"}
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.189676 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" event={"ID":"11020ade-99c4-480e-bb24-5e0051a2b870","Type":"ContainerStarted","Data":"edf26dec2e6ed7d6f44b197e388bab7f82e1dbac800b5ae01f4b0f812a287e35"}
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.270734 4585 scope.go:117] "RemoveContainer" containerID="8d3a75c91f9fab024687eea6a4d22a23b342fcfe4a988c42365e274544ad0fa8"
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.472235 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk"
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.656663 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-util\") pod \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") "
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.656897 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn9gv\" (UniqueName: \"kubernetes.io/projected/ecd88680-0e48-4c4f-9568-9f63dcd3e127-kube-api-access-gn9gv\") pod \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") "
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.656982 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-bundle\") pod \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\" (UID: \"ecd88680-0e48-4c4f-9568-9f63dcd3e127\") "
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.657431 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-bundle" (OuterVolumeSpecName: "bundle") pod "ecd88680-0e48-4c4f-9568-9f63dcd3e127" (UID: "ecd88680-0e48-4c4f-9568-9f63dcd3e127"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.666740 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd88680-0e48-4c4f-9568-9f63dcd3e127-kube-api-access-gn9gv" (OuterVolumeSpecName: "kube-api-access-gn9gv") pod "ecd88680-0e48-4c4f-9568-9f63dcd3e127" (UID: "ecd88680-0e48-4c4f-9568-9f63dcd3e127"). InnerVolumeSpecName "kube-api-access-gn9gv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.668958 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-util" (OuterVolumeSpecName: "util") pod "ecd88680-0e48-4c4f-9568-9f63dcd3e127" (UID: "ecd88680-0e48-4c4f-9568-9f63dcd3e127"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.758664 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn9gv\" (UniqueName: \"kubernetes.io/projected/ecd88680-0e48-4c4f-9568-9f63dcd3e127-kube-api-access-gn9gv\") on node \"crc\" DevicePath \"\""
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.758692 4585 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-bundle\") on node \"crc\" DevicePath \"\""
Feb 15 17:15:45 crc kubenswrapper[4585]: I0215 17:15:45.758700 4585 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecd88680-0e48-4c4f-9568-9f63dcd3e127-util\") on node \"crc\" DevicePath \"\""
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.221375 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk" event={"ID":"ecd88680-0e48-4c4f-9568-9f63dcd3e127","Type":"ContainerDied","Data":"bee114d5b6950d0f243d913ff4f1ff984273b550f981328b0772ba95b143e3a5"}
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.221445 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bee114d5b6950d0f243d913ff4f1ff984273b550f981328b0772ba95b143e3a5"
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.222128 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk"
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.225531 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4ps2_70645395-8d49-4495-a647-b6d43206ecbc/kube-multus/2.log"
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.463356 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.570747 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11020ade-99c4-480e-bb24-5e0051a2b870-config-volume\") pod \"11020ade-99c4-480e-bb24-5e0051a2b870\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") "
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.570805 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11020ade-99c4-480e-bb24-5e0051a2b870-secret-volume\") pod \"11020ade-99c4-480e-bb24-5e0051a2b870\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") "
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.571004 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbp9t\" (UniqueName: \"kubernetes.io/projected/11020ade-99c4-480e-bb24-5e0051a2b870-kube-api-access-nbp9t\") pod \"11020ade-99c4-480e-bb24-5e0051a2b870\" (UID: \"11020ade-99c4-480e-bb24-5e0051a2b870\") "
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.571161 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11020ade-99c4-480e-bb24-5e0051a2b870-config-volume" (OuterVolumeSpecName: "config-volume") pod "11020ade-99c4-480e-bb24-5e0051a2b870" (UID: "11020ade-99c4-480e-bb24-5e0051a2b870"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.571339 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11020ade-99c4-480e-bb24-5e0051a2b870-config-volume\") on node \"crc\" DevicePath \"\""
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.575354 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11020ade-99c4-480e-bb24-5e0051a2b870-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11020ade-99c4-480e-bb24-5e0051a2b870" (UID: "11020ade-99c4-480e-bb24-5e0051a2b870"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.575519 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11020ade-99c4-480e-bb24-5e0051a2b870-kube-api-access-nbp9t" (OuterVolumeSpecName: "kube-api-access-nbp9t") pod "11020ade-99c4-480e-bb24-5e0051a2b870" (UID: "11020ade-99c4-480e-bb24-5e0051a2b870"). InnerVolumeSpecName "kube-api-access-nbp9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.672457 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11020ade-99c4-480e-bb24-5e0051a2b870-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 15 17:15:46 crc kubenswrapper[4585]: I0215 17:15:46.672486 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbp9t\" (UniqueName: \"kubernetes.io/projected/11020ade-99c4-480e-bb24-5e0051a2b870-kube-api-access-nbp9t\") on node \"crc\" DevicePath \"\""
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.239130 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf" event={"ID":"11020ade-99c4-480e-bb24-5e0051a2b870","Type":"ContainerDied","Data":"edf26dec2e6ed7d6f44b197e388bab7f82e1dbac800b5ae01f4b0f812a287e35"}
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.239516 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edf26dec2e6ed7d6f44b197e388bab7f82e1dbac800b5ae01f4b0f812a287e35"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.239223 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.853790 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"]
Feb 15 17:15:47 crc kubenswrapper[4585]: E0215 17:15:47.854054 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerName="extract"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.854070 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerName="extract"
Feb 15 17:15:47 crc kubenswrapper[4585]: E0215 17:15:47.854086 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerName="util"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.854093 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerName="util"
Feb 15 17:15:47 crc kubenswrapper[4585]: E0215 17:15:47.854108 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerName="pull"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.854113 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerName="pull"
Feb 15 17:15:47 crc kubenswrapper[4585]: E0215 17:15:47.854138 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" containerName="collect-profiles"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.854144 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" containerName="collect-profiles"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.854294 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd88680-0e48-4c4f-9568-9f63dcd3e127" containerName="extract"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.854304 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" containerName="collect-profiles"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.854730 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.859483 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.859643 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.868246 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"]
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.872517 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-lxq2g"
Feb 15 17:15:47 crc kubenswrapper[4585]: I0215 17:15:47.996211 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdmg8\" (UniqueName: \"kubernetes.io/projected/db3179ce-d468-434b-9f2c-7fce08fb2ce3-kube-api-access-cdmg8\") pod \"nmstate-operator-694c9596b7-rbvbt\" (UID: \"db3179ce-d468-434b-9f2c-7fce08fb2ce3\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"
Feb 15 17:15:48 crc kubenswrapper[4585]: I0215 17:15:48.097681 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdmg8\" (UniqueName: \"kubernetes.io/projected/db3179ce-d468-434b-9f2c-7fce08fb2ce3-kube-api-access-cdmg8\") pod \"nmstate-operator-694c9596b7-rbvbt\" (UID: \"db3179ce-d468-434b-9f2c-7fce08fb2ce3\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"
Feb 15 17:15:48 crc kubenswrapper[4585]: I0215 17:15:48.116627 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdmg8\" (UniqueName: \"kubernetes.io/projected/db3179ce-d468-434b-9f2c-7fce08fb2ce3-kube-api-access-cdmg8\") pod \"nmstate-operator-694c9596b7-rbvbt\" (UID: \"db3179ce-d468-434b-9f2c-7fce08fb2ce3\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"
Feb 15 17:15:48 crc kubenswrapper[4585]: I0215 17:15:48.216439 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"
Feb 15 17:15:48 crc kubenswrapper[4585]: I0215 17:15:48.465253 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rbvbt"]
Feb 15 17:15:49 crc kubenswrapper[4585]: I0215 17:15:49.253222 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt" event={"ID":"db3179ce-d468-434b-9f2c-7fce08fb2ce3","Type":"ContainerStarted","Data":"923f0d494b4d0cb50e2c7c3fbf9e1097357fd3c5e93dd166b6126f7a31c28ad5"}
Feb 15 17:15:51 crc kubenswrapper[4585]: I0215 17:15:51.270457 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt" event={"ID":"db3179ce-d468-434b-9f2c-7fce08fb2ce3","Type":"ContainerStarted","Data":"427323234c568abe5b63900f8c1c6de83cd812a50ce5c6688bbdd4bf18fab896"}
Feb 15 17:15:51 crc kubenswrapper[4585]: I0215 17:15:51.291994 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-rbvbt" podStartSLOduration=2.389364752 podStartE2EDuration="4.291977002s" podCreationTimestamp="2026-02-15 17:15:47 +0000 UTC" firstStartedPulling="2026-02-15 17:15:48.482953121 +0000 UTC m=+604.426361263" lastFinishedPulling="2026-02-15 17:15:50.385565361 +0000 UTC m=+606.328973513" observedRunningTime="2026-02-15 17:15:51.287054219 +0000 UTC m=+607.230462351" watchObservedRunningTime="2026-02-15 17:15:51.291977002 +0000 UTC m=+607.235385134"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.037322 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.039534 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.042222 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-kv9w2"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.048938 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.087412 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.088261 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.092913 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.112429 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-9zlpw"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.113277 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.117934 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.172026 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktlb7\" (UniqueName: \"kubernetes.io/projected/fd6e126c-f69c-4a79-932b-976d3cb97f83-kube-api-access-ktlb7\") pod \"nmstate-metrics-58c85c668d-d8x4k\" (UID: \"fd6e126c-f69c-4a79-932b-976d3cb97f83\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.219804 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.220861 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.223962 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.224167 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.224274 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-dd6sv"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.244657 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.273692 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlzs\" (UniqueName: \"kubernetes.io/projected/b3c69254-35c7-4f91-b059-6a72be7af29f-kube-api-access-qrlzs\") pod \"nmstate-webhook-866bcb46dc-zk68t\" (UID: \"b3c69254-35c7-4f91-b059-6a72be7af29f\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.273733 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkzp7\" (UniqueName: \"kubernetes.io/projected/7e13449a-7362-4fca-98c9-8ba86698e6e7-kube-api-access-dkzp7\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.273766 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b3c69254-35c7-4f91-b059-6a72be7af29f-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-zk68t\" (UID: \"b3c69254-35c7-4f91-b059-6a72be7af29f\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.273792 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-ovs-socket\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.273943 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-nmstate-lock\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.273990 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktlb7\" (UniqueName: \"kubernetes.io/projected/fd6e126c-f69c-4a79-932b-976d3cb97f83-kube-api-access-ktlb7\") pod \"nmstate-metrics-58c85c668d-d8x4k\" (UID: \"fd6e126c-f69c-4a79-932b-976d3cb97f83\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.274052 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-dbus-socket\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.298447 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktlb7\" (UniqueName: \"kubernetes.io/projected/fd6e126c-f69c-4a79-932b-976d3cb97f83-kube-api-access-ktlb7\") pod \"nmstate-metrics-58c85c668d-d8x4k\" (UID: \"fd6e126c-f69c-4a79-932b-976d3cb97f83\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.362956 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375321 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/06076f8a-c221-46f7-a72b-2287367d08c8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375365 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/06076f8a-c221-46f7-a72b-2287367d08c8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375389 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlzs\" (UniqueName: \"kubernetes.io/projected/b3c69254-35c7-4f91-b059-6a72be7af29f-kube-api-access-qrlzs\") pod \"nmstate-webhook-866bcb46dc-zk68t\" (UID: \"b3c69254-35c7-4f91-b059-6a72be7af29f\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375412 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkzp7\" (UniqueName: \"kubernetes.io/projected/7e13449a-7362-4fca-98c9-8ba86698e6e7-kube-api-access-dkzp7\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375444 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b3c69254-35c7-4f91-b059-6a72be7af29f-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-zk68t\" (UID: \"b3c69254-35c7-4f91-b059-6a72be7af29f\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375469 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-ovs-socket\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375497 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-nmstate-lock\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375523 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cfbm\" (UniqueName: \"kubernetes.io/projected/06076f8a-c221-46f7-a72b-2287367d08c8-kube-api-access-7cfbm\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375550 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-dbus-socket\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.375851 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-dbus-socket\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.376067 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-ovs-socket\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.376180 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7e13449a-7362-4fca-98c9-8ba86698e6e7-nmstate-lock\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: E0215 17:15:59.376268 4585 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Feb 15 17:15:59 crc kubenswrapper[4585]: E0215 17:15:59.376307 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3c69254-35c7-4f91-b059-6a72be7af29f-tls-key-pair podName:b3c69254-35c7-4f91-b059-6a72be7af29f nodeName:}" failed. No retries permitted until 2026-02-15 17:15:59.876293837 +0000 UTC m=+615.819701959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/b3c69254-35c7-4f91-b059-6a72be7af29f-tls-key-pair") pod "nmstate-webhook-866bcb46dc-zk68t" (UID: "b3c69254-35c7-4f91-b059-6a72be7af29f") : secret "openshift-nmstate-webhook" not found
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.406767 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlzs\" (UniqueName: \"kubernetes.io/projected/b3c69254-35c7-4f91-b059-6a72be7af29f-kube-api-access-qrlzs\") pod \"nmstate-webhook-866bcb46dc-zk68t\" (UID: \"b3c69254-35c7-4f91-b059-6a72be7af29f\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.406937 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkzp7\" (UniqueName: \"kubernetes.io/projected/7e13449a-7362-4fca-98c9-8ba86698e6e7-kube-api-access-dkzp7\") pod \"nmstate-handler-9zlpw\" (UID: \"7e13449a-7362-4fca-98c9-8ba86698e6e7\") " pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.433709 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-9zlpw"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.476583 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/06076f8a-c221-46f7-a72b-2287367d08c8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.476643 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/06076f8a-c221-46f7-a72b-2287367d08c8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.476731 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cfbm\" (UniqueName: \"kubernetes.io/projected/06076f8a-c221-46f7-a72b-2287367d08c8-kube-api-access-7cfbm\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.477457 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/06076f8a-c221-46f7-a72b-2287367d08c8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: E0215 17:15:59.477821 4585 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Feb 15 17:15:59 crc kubenswrapper[4585]: E0215 17:15:59.477938 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06076f8a-c221-46f7-a72b-2287367d08c8-plugin-serving-cert podName:06076f8a-c221-46f7-a72b-2287367d08c8 nodeName:}" failed. No retries permitted until 2026-02-15 17:15:59.977923134 +0000 UTC m=+615.921331266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/06076f8a-c221-46f7-a72b-2287367d08c8-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-v6dds" (UID: "06076f8a-c221-46f7-a72b-2287367d08c8") : secret "plugin-serving-cert" not found
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.482431 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f8886bc7c-m7j5m"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.483264 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.511246 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cfbm\" (UniqueName: \"kubernetes.io/projected/06076f8a-c221-46f7-a72b-2287367d08c8-kube-api-access-7cfbm\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.523120 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8886bc7c-m7j5m"]
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.679753 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-oauth-config\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.679809 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-config\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.680029 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-trusted-ca-bundle\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.680142 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-serving-cert\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.680242 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-oauth-serving-cert\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.680303 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlcvs\" (UniqueName: \"kubernetes.io/projected/cb374992-e98c-4f12-acaf-2e52a9c782dc-kube-api-access-jlcvs\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.680339 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-service-ca\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.781447 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-trusted-ca-bundle\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.781518 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-serving-cert\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.781539 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-oauth-serving-cert\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m"
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.781560 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlcvs\" (UniqueName: \"kubernetes.io/projected/cb374992-e98c-4f12-acaf-2e52a9c782dc-kube-api-access-jlcvs\") pod \"console-6f8886bc7c-m7j5m\" (UID:
\"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.781579 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-service-ca\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.781633 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-oauth-config\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.781653 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-config\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.782727 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-config\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.783197 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-service-ca\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " 
pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.783807 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-trusted-ca-bundle\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.783922 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cb374992-e98c-4f12-acaf-2e52a9c782dc-oauth-serving-cert\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.787466 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-serving-cert\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.787942 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cb374992-e98c-4f12-acaf-2e52a9c782dc-console-oauth-config\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.799268 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlcvs\" (UniqueName: \"kubernetes.io/projected/cb374992-e98c-4f12-acaf-2e52a9c782dc-kube-api-access-jlcvs\") pod \"console-6f8886bc7c-m7j5m\" (UID: \"cb374992-e98c-4f12-acaf-2e52a9c782dc\") " pod="openshift-console/console-6f8886bc7c-m7j5m" 
Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.840897 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.882806 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b3c69254-35c7-4f91-b059-6a72be7af29f-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-zk68t\" (UID: \"b3c69254-35c7-4f91-b059-6a72be7af29f\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.888844 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b3c69254-35c7-4f91-b059-6a72be7af29f-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-zk68t\" (UID: \"b3c69254-35c7-4f91-b059-6a72be7af29f\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.966115 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k"] Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.985060 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/06076f8a-c221-46f7-a72b-2287367d08c8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds" Feb 15 17:15:59 crc kubenswrapper[4585]: I0215 17:15:59.993151 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/06076f8a-c221-46f7-a72b-2287367d08c8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-v6dds\" (UID: \"06076f8a-c221-46f7-a72b-2287367d08c8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds" Feb 
15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.013346 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.101380 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8886bc7c-m7j5m"] Feb 15 17:16:00 crc kubenswrapper[4585]: W0215 17:16:00.114684 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb374992_e98c_4f12_acaf_2e52a9c782dc.slice/crio-90932bd129739d67ec1bff1ae782638d08060a810e65abda3df97e72c9023da0 WatchSource:0}: Error finding container 90932bd129739d67ec1bff1ae782638d08060a810e65abda3df97e72c9023da0: Status 404 returned error can't find the container with id 90932bd129739d67ec1bff1ae782638d08060a810e65abda3df97e72c9023da0 Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.140212 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds" Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.262960 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t"] Feb 15 17:16:00 crc kubenswrapper[4585]: W0215 17:16:00.271254 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c69254_35c7_4f91_b059_6a72be7af29f.slice/crio-a21cef5ab096e5e164ecd0895af521d2de163b4a1003828a70ccecb9f2586d55 WatchSource:0}: Error finding container a21cef5ab096e5e164ecd0895af521d2de163b4a1003828a70ccecb9f2586d55: Status 404 returned error can't find the container with id a21cef5ab096e5e164ecd0895af521d2de163b4a1003828a70ccecb9f2586d55 Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.360827 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" event={"ID":"b3c69254-35c7-4f91-b059-6a72be7af29f","Type":"ContainerStarted","Data":"a21cef5ab096e5e164ecd0895af521d2de163b4a1003828a70ccecb9f2586d55"} Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.366126 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8886bc7c-m7j5m" event={"ID":"cb374992-e98c-4f12-acaf-2e52a9c782dc","Type":"ContainerStarted","Data":"e6f16a4189b9917e22671053b669de71ecafee1109e4e5312df9f098f2e9f604"} Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.366191 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8886bc7c-m7j5m" event={"ID":"cb374992-e98c-4f12-acaf-2e52a9c782dc","Type":"ContainerStarted","Data":"90932bd129739d67ec1bff1ae782638d08060a810e65abda3df97e72c9023da0"} Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.370794 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds"] Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 
17:16:00.371179 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k" event={"ID":"fd6e126c-f69c-4a79-932b-976d3cb97f83","Type":"ContainerStarted","Data":"1f375288ef7b8a9e8d6ccf54402481d243936f170b02a9ed2f734154009dfcef"} Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.383176 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9zlpw" event={"ID":"7e13449a-7362-4fca-98c9-8ba86698e6e7","Type":"ContainerStarted","Data":"3a3f17722d3a77b147bea29c9ef744582377449d6c42ba3ab8fa282eaf3d1ee7"} Feb 15 17:16:00 crc kubenswrapper[4585]: I0215 17:16:00.392551 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f8886bc7c-m7j5m" podStartSLOduration=1.392533594 podStartE2EDuration="1.392533594s" podCreationTimestamp="2026-02-15 17:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:16:00.389188375 +0000 UTC m=+616.332596497" watchObservedRunningTime="2026-02-15 17:16:00.392533594 +0000 UTC m=+616.335941726" Feb 15 17:16:01 crc kubenswrapper[4585]: I0215 17:16:01.397077 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds" event={"ID":"06076f8a-c221-46f7-a72b-2287367d08c8","Type":"ContainerStarted","Data":"80166c58af30aa3c75756270c4894d76f2722dc48468ff11919857ac7d463330"} Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.418945 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds" event={"ID":"06076f8a-c221-46f7-a72b-2287367d08c8","Type":"ContainerStarted","Data":"f282d982a941f7529a73dfe6c17130fcf8d7b9c3f121d49c6b285760258c2ffb"} Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.423048 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" event={"ID":"b3c69254-35c7-4f91-b059-6a72be7af29f","Type":"ContainerStarted","Data":"97b5691714362de5bbdc889b4cd515159dc22f79845f05dba65b6fb1db4dc641"} Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.423829 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.425403 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k" event={"ID":"fd6e126c-f69c-4a79-932b-976d3cb97f83","Type":"ContainerStarted","Data":"0fb28a99ee71eacfde8627e022d3c38f5c50c537106961c86d819d8defc48563"} Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.427567 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9zlpw" event={"ID":"7e13449a-7362-4fca-98c9-8ba86698e6e7","Type":"ContainerStarted","Data":"4df424bfa958846f8216b30ae3377d17a788e5f5e95aa8d2e5c9b1035a610d3c"} Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.428323 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-9zlpw" Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.453647 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-v6dds" podStartSLOduration=2.126075327 podStartE2EDuration="4.453622929s" podCreationTimestamp="2026-02-15 17:15:59 +0000 UTC" firstStartedPulling="2026-02-15 17:16:00.382813404 +0000 UTC m=+616.326221526" lastFinishedPulling="2026-02-15 17:16:02.710360996 +0000 UTC m=+618.653769128" observedRunningTime="2026-02-15 17:16:03.436742986 +0000 UTC m=+619.380151138" watchObservedRunningTime="2026-02-15 17:16:03.453622929 +0000 UTC m=+619.397031071" Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.502106 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-9zlpw" podStartSLOduration=1.358429508 podStartE2EDuration="4.502087469s" podCreationTimestamp="2026-02-15 17:15:59 +0000 UTC" firstStartedPulling="2026-02-15 17:15:59.510404385 +0000 UTC m=+615.453812517" lastFinishedPulling="2026-02-15 17:16:02.654062346 +0000 UTC m=+618.597470478" observedRunningTime="2026-02-15 17:16:03.489940563 +0000 UTC m=+619.433348705" watchObservedRunningTime="2026-02-15 17:16:03.502087469 +0000 UTC m=+619.445495601" Feb 15 17:16:03 crc kubenswrapper[4585]: I0215 17:16:03.520237 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" podStartSLOduration=2.068841071 podStartE2EDuration="4.520218916s" podCreationTimestamp="2026-02-15 17:15:59 +0000 UTC" firstStartedPulling="2026-02-15 17:16:00.274630871 +0000 UTC m=+616.218039003" lastFinishedPulling="2026-02-15 17:16:02.726008716 +0000 UTC m=+618.669416848" observedRunningTime="2026-02-15 17:16:03.517160054 +0000 UTC m=+619.460568206" watchObservedRunningTime="2026-02-15 17:16:03.520218916 +0000 UTC m=+619.463627048" Feb 15 17:16:05 crc kubenswrapper[4585]: I0215 17:16:05.449161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k" event={"ID":"fd6e126c-f69c-4a79-932b-976d3cb97f83","Type":"ContainerStarted","Data":"1a21d3d396e47350c6d5b429f1734427a3046f4874f69654b60f5db0991f53b6"} Feb 15 17:16:05 crc kubenswrapper[4585]: I0215 17:16:05.479932 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-d8x4k" podStartSLOduration=1.5148507869999999 podStartE2EDuration="6.479904617s" podCreationTimestamp="2026-02-15 17:15:59 +0000 UTC" firstStartedPulling="2026-02-15 17:15:59.979351989 +0000 UTC m=+615.922760161" lastFinishedPulling="2026-02-15 17:16:04.944405859 +0000 UTC m=+620.887813991" observedRunningTime="2026-02-15 17:16:05.469896689 +0000 UTC 
m=+621.413304861" watchObservedRunningTime="2026-02-15 17:16:05.479904617 +0000 UTC m=+621.423312749" Feb 15 17:16:09 crc kubenswrapper[4585]: I0215 17:16:09.463821 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-9zlpw" Feb 15 17:16:09 crc kubenswrapper[4585]: I0215 17:16:09.841371 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:16:09 crc kubenswrapper[4585]: I0215 17:16:09.841752 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:16:09 crc kubenswrapper[4585]: I0215 17:16:09.848212 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:16:10 crc kubenswrapper[4585]: I0215 17:16:10.515130 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f8886bc7c-m7j5m" Feb 15 17:16:10 crc kubenswrapper[4585]: I0215 17:16:10.635918 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gfx64"] Feb 15 17:16:20 crc kubenswrapper[4585]: I0215 17:16:20.024521 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-zk68t" Feb 15 17:16:35 crc kubenswrapper[4585]: I0215 17:16:35.687619 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-gfx64" podUID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" containerName="console" containerID="cri-o://e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427" gracePeriod=15 Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.085394 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gfx64_7bd79bc9-3114-42f2-9a8f-53e39a6abe5c/console/0.log" Feb 15 17:16:36 crc 
kubenswrapper[4585]: I0215 17:16:36.086133 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.179001 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85246\" (UniqueName: \"kubernetes.io/projected/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-kube-api-access-85246\") pod \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.179059 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-trusted-ca-bundle\") pod \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.179108 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-config\") pod \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.179136 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-oauth-serving-cert\") pod \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.179175 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-service-ca\") pod \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " Feb 15 17:16:36 crc 
kubenswrapper[4585]: I0215 17:16:36.179191 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-oauth-config\") pod \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.179216 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-serving-cert\") pod \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\" (UID: \"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c\") " Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.179779 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" (UID: "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.180155 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" (UID: "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.180470 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-config" (OuterVolumeSpecName: "console-config") pod "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" (UID: "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.180837 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-service-ca" (OuterVolumeSpecName: "service-ca") pod "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" (UID: "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.187189 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" (UID: "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.189368 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" (UID: "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.204077 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-kube-api-access-85246" (OuterVolumeSpecName: "kube-api-access-85246") pod "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" (UID: "7bd79bc9-3114-42f2-9a8f-53e39a6abe5c"). InnerVolumeSpecName "kube-api-access-85246". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.281925 4585 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.281954 4585 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.281963 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-service-ca\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.281974 4585 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.281983 4585 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.281992 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85246\" (UniqueName: \"kubernetes.io/projected/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-kube-api-access-85246\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.282002 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:36 crc 
kubenswrapper[4585]: I0215 17:16:36.767776 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-gfx64_7bd79bc9-3114-42f2-9a8f-53e39a6abe5c/console/0.log" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.768052 4585 generic.go:334] "Generic (PLEG): container finished" podID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" containerID="e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427" exitCode=2 Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.768088 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfx64" event={"ID":"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c","Type":"ContainerDied","Data":"e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427"} Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.768124 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-gfx64" event={"ID":"7bd79bc9-3114-42f2-9a8f-53e39a6abe5c","Type":"ContainerDied","Data":"0e992700581df520d0d114aaaf6cd92d0f8386a053cd4dc5a8f000c1a2c61642"} Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.768146 4585 scope.go:117] "RemoveContainer" containerID="e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.768154 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-gfx64" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.791617 4585 scope.go:117] "RemoveContainer" containerID="e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427" Feb 15 17:16:36 crc kubenswrapper[4585]: E0215 17:16:36.792519 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427\": container with ID starting with e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427 not found: ID does not exist" containerID="e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.794022 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427"} err="failed to get container status \"e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427\": rpc error: code = NotFound desc = could not find container \"e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427\": container with ID starting with e547dab6991f09fedb5017166e8ece9ce0e2a240c34288b09bee0e2bf4908427 not found: ID does not exist" Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.807468 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-gfx64"] Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.817263 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-gfx64"] Feb 15 17:16:36 crc kubenswrapper[4585]: I0215 17:16:36.855899 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" path="/var/lib/kubelet/pods/7bd79bc9-3114-42f2-9a8f-53e39a6abe5c/volumes" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.208123 4585 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll"] Feb 15 17:16:42 crc kubenswrapper[4585]: E0215 17:16:42.209929 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" containerName="console" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.210008 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" containerName="console" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.210245 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd79bc9-3114-42f2-9a8f-53e39a6abe5c" containerName="console" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.211552 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.213821 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.223674 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll"] Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.283701 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnsh\" (UniqueName: \"kubernetes.io/projected/64b64630-6199-44c9-811f-4bba668cf494-kube-api-access-cwnsh\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.283748 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.283793 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.385061 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnsh\" (UniqueName: \"kubernetes.io/projected/64b64630-6199-44c9-811f-4bba668cf494-kube-api-access-cwnsh\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.385150 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.385233 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-util\") pod 
\"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.386070 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.386139 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.406983 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnsh\" (UniqueName: \"kubernetes.io/projected/64b64630-6199-44c9-811f-4bba668cf494-kube-api-access-cwnsh\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:42 crc kubenswrapper[4585]: I0215 17:16:42.538728 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:43 crc kubenswrapper[4585]: I0215 17:16:43.061959 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll"] Feb 15 17:16:43 crc kubenswrapper[4585]: I0215 17:16:43.838168 4585 generic.go:334] "Generic (PLEG): container finished" podID="64b64630-6199-44c9-811f-4bba668cf494" containerID="0463d7dcea3f2fe84a848e5d37d0d9eaf8cb55753dd9387ea724d1fe5b83e066" exitCode=0 Feb 15 17:16:43 crc kubenswrapper[4585]: I0215 17:16:43.838409 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" event={"ID":"64b64630-6199-44c9-811f-4bba668cf494","Type":"ContainerDied","Data":"0463d7dcea3f2fe84a848e5d37d0d9eaf8cb55753dd9387ea724d1fe5b83e066"} Feb 15 17:16:43 crc kubenswrapper[4585]: I0215 17:16:43.838460 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" event={"ID":"64b64630-6199-44c9-811f-4bba668cf494","Type":"ContainerStarted","Data":"562fbc7fe105b0c387481b8e1647099e040a6e46afefdbe7bd6e46296efb2a97"} Feb 15 17:16:45 crc kubenswrapper[4585]: I0215 17:16:45.864518 4585 generic.go:334] "Generic (PLEG): container finished" podID="64b64630-6199-44c9-811f-4bba668cf494" containerID="1d3518be9e55a081cf7f1be6e10aa4948123e9c448a924f676e86fc8c2fb04a2" exitCode=0 Feb 15 17:16:45 crc kubenswrapper[4585]: I0215 17:16:45.864629 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" event={"ID":"64b64630-6199-44c9-811f-4bba668cf494","Type":"ContainerDied","Data":"1d3518be9e55a081cf7f1be6e10aa4948123e9c448a924f676e86fc8c2fb04a2"} Feb 15 17:16:46 crc kubenswrapper[4585]: I0215 17:16:46.895654 4585 
generic.go:334] "Generic (PLEG): container finished" podID="64b64630-6199-44c9-811f-4bba668cf494" containerID="1fbea8ed0c60851dc8fc960a87f0a55defc58b595326cecfe6162be48f3879f0" exitCode=0 Feb 15 17:16:46 crc kubenswrapper[4585]: I0215 17:16:46.895857 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" event={"ID":"64b64630-6199-44c9-811f-4bba668cf494","Type":"ContainerDied","Data":"1fbea8ed0c60851dc8fc960a87f0a55defc58b595326cecfe6162be48f3879f0"} Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.231747 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.416117 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-util\") pod \"64b64630-6199-44c9-811f-4bba668cf494\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.416226 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-bundle\") pod \"64b64630-6199-44c9-811f-4bba668cf494\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.416339 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwnsh\" (UniqueName: \"kubernetes.io/projected/64b64630-6199-44c9-811f-4bba668cf494-kube-api-access-cwnsh\") pod \"64b64630-6199-44c9-811f-4bba668cf494\" (UID: \"64b64630-6199-44c9-811f-4bba668cf494\") " Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.419288 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-bundle" (OuterVolumeSpecName: "bundle") pod "64b64630-6199-44c9-811f-4bba668cf494" (UID: "64b64630-6199-44c9-811f-4bba668cf494"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.428422 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b64630-6199-44c9-811f-4bba668cf494-kube-api-access-cwnsh" (OuterVolumeSpecName: "kube-api-access-cwnsh") pod "64b64630-6199-44c9-811f-4bba668cf494" (UID: "64b64630-6199-44c9-811f-4bba668cf494"). InnerVolumeSpecName "kube-api-access-cwnsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.439723 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-util" (OuterVolumeSpecName: "util") pod "64b64630-6199-44c9-811f-4bba668cf494" (UID: "64b64630-6199-44c9-811f-4bba668cf494"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.518480 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwnsh\" (UniqueName: \"kubernetes.io/projected/64b64630-6199-44c9-811f-4bba668cf494-kube-api-access-cwnsh\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.518531 4585 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-util\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.518550 4585 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64b64630-6199-44c9-811f-4bba668cf494-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.924937 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" event={"ID":"64b64630-6199-44c9-811f-4bba668cf494","Type":"ContainerDied","Data":"562fbc7fe105b0c387481b8e1647099e040a6e46afefdbe7bd6e46296efb2a97"} Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.924976 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="562fbc7fe105b0c387481b8e1647099e040a6e46afefdbe7bd6e46296efb2a97" Feb 15 17:16:48 crc kubenswrapper[4585]: I0215 17:16:48.925070 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.425275 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp"] Feb 15 17:17:00 crc kubenswrapper[4585]: E0215 17:17:00.426046 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b64630-6199-44c9-811f-4bba668cf494" containerName="util" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.426059 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b64630-6199-44c9-811f-4bba668cf494" containerName="util" Feb 15 17:17:00 crc kubenswrapper[4585]: E0215 17:17:00.426069 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b64630-6199-44c9-811f-4bba668cf494" containerName="pull" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.426075 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b64630-6199-44c9-811f-4bba668cf494" containerName="pull" Feb 15 17:17:00 crc kubenswrapper[4585]: E0215 17:17:00.426095 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b64630-6199-44c9-811f-4bba668cf494" containerName="extract" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.426101 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b64630-6199-44c9-811f-4bba668cf494" containerName="extract" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.426237 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b64630-6199-44c9-811f-4bba668cf494" containerName="extract" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.426675 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.440857 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.440913 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.441067 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.442216 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-g7zrf" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.443524 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.457964 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp"] Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.607384 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-webhook-cert\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.607724 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9jf\" (UniqueName: \"kubernetes.io/projected/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-kube-api-access-jh9jf\") pod 
\"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.607849 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-apiservice-cert\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.709102 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-webhook-cert\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.709421 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9jf\" (UniqueName: \"kubernetes.io/projected/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-kube-api-access-jh9jf\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.709471 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-apiservice-cert\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc 
kubenswrapper[4585]: I0215 17:17:00.715391 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-apiservice-cert\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.715405 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-webhook-cert\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.740783 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9jf\" (UniqueName: \"kubernetes.io/projected/0005f5d7-fa80-4a9c-90d4-dbe50d95a235-kube-api-access-jh9jf\") pod \"metallb-operator-controller-manager-769664b7f6-hp4wp\" (UID: \"0005f5d7-fa80-4a9c-90d4-dbe50d95a235\") " pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.741562 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.779718 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz"] Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.780591 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.785950 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.785989 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wnszk" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.786161 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.806497 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz"] Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.919628 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3753870e-944b-4c36-aa33-deb44f4ccb64-webhook-cert\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.919874 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rd8g\" (UniqueName: \"kubernetes.io/projected/3753870e-944b-4c36-aa33-deb44f4ccb64-kube-api-access-9rd8g\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:00 crc kubenswrapper[4585]: I0215 17:17:00.919936 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/3753870e-944b-4c36-aa33-deb44f4ccb64-apiservice-cert\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.021644 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3753870e-944b-4c36-aa33-deb44f4ccb64-webhook-cert\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.021692 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rd8g\" (UniqueName: \"kubernetes.io/projected/3753870e-944b-4c36-aa33-deb44f4ccb64-kube-api-access-9rd8g\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.021755 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3753870e-944b-4c36-aa33-deb44f4ccb64-apiservice-cert\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.027691 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3753870e-944b-4c36-aa33-deb44f4ccb64-webhook-cert\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc 
kubenswrapper[4585]: I0215 17:17:01.042086 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3753870e-944b-4c36-aa33-deb44f4ccb64-apiservice-cert\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.042234 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rd8g\" (UniqueName: \"kubernetes.io/projected/3753870e-944b-4c36-aa33-deb44f4ccb64-kube-api-access-9rd8g\") pod \"metallb-operator-webhook-server-577b585f8-g2ggz\" (UID: \"3753870e-944b-4c36-aa33-deb44f4ccb64\") " pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.060648 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp"] Feb 15 17:17:01 crc kubenswrapper[4585]: W0215 17:17:01.068690 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0005f5d7_fa80_4a9c_90d4_dbe50d95a235.slice/crio-32fc23240b89ae35dd42386a41ad4defcde4c22c4adf76e4e2171875b1626cab WatchSource:0}: Error finding container 32fc23240b89ae35dd42386a41ad4defcde4c22c4adf76e4e2171875b1626cab: Status 404 returned error can't find the container with id 32fc23240b89ae35dd42386a41ad4defcde4c22c4adf76e4e2171875b1626cab Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.118179 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:01 crc kubenswrapper[4585]: I0215 17:17:01.477654 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz"] Feb 15 17:17:02 crc kubenswrapper[4585]: I0215 17:17:02.040451 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" event={"ID":"3753870e-944b-4c36-aa33-deb44f4ccb64","Type":"ContainerStarted","Data":"477d7f72ce2bff49479012c485cf847964dffaa55ebba33b58616464e75856f8"} Feb 15 17:17:02 crc kubenswrapper[4585]: I0215 17:17:02.041638 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" event={"ID":"0005f5d7-fa80-4a9c-90d4-dbe50d95a235","Type":"ContainerStarted","Data":"32fc23240b89ae35dd42386a41ad4defcde4c22c4adf76e4e2171875b1626cab"} Feb 15 17:17:07 crc kubenswrapper[4585]: I0215 17:17:07.086692 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" event={"ID":"0005f5d7-fa80-4a9c-90d4-dbe50d95a235","Type":"ContainerStarted","Data":"bd6f48be985e4c7d8f51127317f282fd38098d8a829a89d4c304b4e0bf51c14a"} Feb 15 17:17:07 crc kubenswrapper[4585]: I0215 17:17:07.087459 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:07 crc kubenswrapper[4585]: I0215 17:17:07.089976 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" event={"ID":"3753870e-944b-4c36-aa33-deb44f4ccb64","Type":"ContainerStarted","Data":"d8c7eb9a8b5b1f8ce36bf31aef0ce9a67e1ab534c18d38f538741652129a2b84"} Feb 15 17:17:07 crc kubenswrapper[4585]: I0215 17:17:07.090150 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:07 crc kubenswrapper[4585]: I0215 17:17:07.114471 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" podStartSLOduration=1.584747456 podStartE2EDuration="7.114448042s" podCreationTimestamp="2026-02-15 17:17:00 +0000 UTC" firstStartedPulling="2026-02-15 17:17:01.071152574 +0000 UTC m=+677.014560706" lastFinishedPulling="2026-02-15 17:17:06.60085316 +0000 UTC m=+682.544261292" observedRunningTime="2026-02-15 17:17:07.10558977 +0000 UTC m=+683.048997912" watchObservedRunningTime="2026-02-15 17:17:07.114448042 +0000 UTC m=+683.057856194" Feb 15 17:17:07 crc kubenswrapper[4585]: I0215 17:17:07.126266 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" podStartSLOduration=1.992820009 podStartE2EDuration="7.126251304s" podCreationTimestamp="2026-02-15 17:17:00 +0000 UTC" firstStartedPulling="2026-02-15 17:17:01.490208477 +0000 UTC m=+677.433616609" lastFinishedPulling="2026-02-15 17:17:06.623639772 +0000 UTC m=+682.567047904" observedRunningTime="2026-02-15 17:17:07.125797221 +0000 UTC m=+683.069205363" watchObservedRunningTime="2026-02-15 17:17:07.126251304 +0000 UTC m=+683.069659446" Feb 15 17:17:17 crc kubenswrapper[4585]: I0215 17:17:17.014606 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:17:17 crc kubenswrapper[4585]: I0215 17:17:17.015536 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:17:21 crc kubenswrapper[4585]: I0215 17:17:21.126015 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-577b585f8-g2ggz" Feb 15 17:17:40 crc kubenswrapper[4585]: I0215 17:17:40.745961 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-769664b7f6-hp4wp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.495330 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-nprl8"] Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.498493 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.501320 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.501733 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.501929 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-5sgsj" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.503689 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8"] Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.504902 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.505917 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.525801 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8"] Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.584955 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-mhjlp"] Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.589064 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.591517 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.592180 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.592442 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.592535 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qdcz6" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.610637 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.610691 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" 
(UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-sockets\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.610854 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bfd\" (UniqueName: \"kubernetes.io/projected/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-kube-api-access-t8bfd\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.610953 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-startup\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.611034 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-conf\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.611058 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d563416-a6d9-452b-818e-6129a4343937-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dtjq8\" (UID: \"4d563416-a6d9-452b-818e-6129a4343937\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.611132 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d54ws\" (UniqueName: 
\"kubernetes.io/projected/4d563416-a6d9-452b-818e-6129a4343937-kube-api-access-d54ws\") pod \"frr-k8s-webhook-server-78b44bf5bb-dtjq8\" (UID: \"4d563416-a6d9-452b-818e-6129a4343937\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.611229 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics-certs\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.611256 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-reloader\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.624562 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-pk5r2"] Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.625939 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.627860 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.632763 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-pk5r2"] Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714433 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-startup\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714485 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714508 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-cert\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714540 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-reloader\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714561 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714580 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metallb-excludel2\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714632 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-metrics-certs\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714659 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bfd\" (UniqueName: \"kubernetes.io/projected/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-kube-api-access-t8bfd\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714683 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d563416-a6d9-452b-818e-6129a4343937-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dtjq8\" (UID: \"4d563416-a6d9-452b-818e-6129a4343937\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714699 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-conf\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714719 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d54ws\" (UniqueName: \"kubernetes.io/projected/4d563416-a6d9-452b-818e-6129a4343937-kube-api-access-d54ws\") pod \"frr-k8s-webhook-server-78b44bf5bb-dtjq8\" (UID: \"4d563416-a6d9-452b-818e-6129a4343937\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714740 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics-certs\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714760 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metrics-certs\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714776 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9nth\" (UniqueName: \"kubernetes.io/projected/8b78ad86-3000-4ba4-b544-09b5890298e9-kube-api-access-r9nth\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714796 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dsg7\" (UniqueName: 
\"kubernetes.io/projected/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-kube-api-access-6dsg7\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.714812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-sockets\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.716796 4585 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.716878 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d563416-a6d9-452b-818e-6129a4343937-cert podName:4d563416-a6d9-452b-818e-6129a4343937 nodeName:}" failed. No retries permitted until 2026-02-15 17:17:42.216857027 +0000 UTC m=+718.160265159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4d563416-a6d9-452b-818e-6129a4343937-cert") pod "frr-k8s-webhook-server-78b44bf5bb-dtjq8" (UID: "4d563416-a6d9-452b-818e-6129a4343937") : secret "frr-k8s-webhook-server-cert" not found Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.716893 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-startup\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.717102 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-conf\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.717301 4585 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.717341 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics-certs podName:89f3a2fa-b24d-4bbc-ad36-9727e45e2e52 nodeName:}" failed. No retries permitted until 2026-02-15 17:17:42.21732793 +0000 UTC m=+718.160736062 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics-certs") pod "frr-k8s-nprl8" (UID: "89f3a2fa-b24d-4bbc-ad36-9727e45e2e52") : secret "frr-k8s-certs-secret" not found Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.717388 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.717547 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-frr-sockets\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.718399 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-reloader\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.737117 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bfd\" (UniqueName: \"kubernetes.io/projected/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-kube-api-access-t8bfd\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.747330 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d54ws\" (UniqueName: \"kubernetes.io/projected/4d563416-a6d9-452b-818e-6129a4343937-kube-api-access-d54ws\") pod \"frr-k8s-webhook-server-78b44bf5bb-dtjq8\" (UID: 
\"4d563416-a6d9-452b-818e-6129a4343937\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816211 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metallb-excludel2\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816269 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-metrics-certs\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.816374 4585 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.816507 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-metrics-certs podName:8b78ad86-3000-4ba4-b544-09b5890298e9 nodeName:}" failed. No retries permitted until 2026-02-15 17:17:42.316491846 +0000 UTC m=+718.259899978 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-metrics-certs") pod "controller-69bbfbf88f-pk5r2" (UID: "8b78ad86-3000-4ba4-b544-09b5890298e9") : secret "controller-certs-secret" not found Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816734 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metrics-certs\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816758 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9nth\" (UniqueName: \"kubernetes.io/projected/8b78ad86-3000-4ba4-b544-09b5890298e9-kube-api-access-r9nth\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.816821 4585 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.816844 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metrics-certs podName:34c11c17-aea3-4d9f-8a9b-f84da1d1a1af nodeName:}" failed. No retries permitted until 2026-02-15 17:17:42.316837246 +0000 UTC m=+718.260245378 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metrics-certs") pod "speaker-mhjlp" (UID: "34c11c17-aea3-4d9f-8a9b-f84da1d1a1af") : secret "speaker-certs-secret" not found Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816862 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dsg7\" (UniqueName: \"kubernetes.io/projected/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-kube-api-access-6dsg7\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816905 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816919 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-cert\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.816959 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metallb-excludel2\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.817132 4585 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 15 17:17:41 crc kubenswrapper[4585]: E0215 17:17:41.817195 4585 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist podName:34c11c17-aea3-4d9f-8a9b-f84da1d1a1af nodeName:}" failed. No retries permitted until 2026-02-15 17:17:42.317178125 +0000 UTC m=+718.260586257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist") pod "speaker-mhjlp" (UID: "34c11c17-aea3-4d9f-8a9b-f84da1d1a1af") : secret "metallb-memberlist" not found Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.821930 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-cert\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.833474 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dsg7\" (UniqueName: \"kubernetes.io/projected/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-kube-api-access-6dsg7\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:41 crc kubenswrapper[4585]: I0215 17:17:41.835181 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9nth\" (UniqueName: \"kubernetes.io/projected/8b78ad86-3000-4ba4-b544-09b5890298e9-kube-api-access-r9nth\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.224698 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d563416-a6d9-452b-818e-6129a4343937-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dtjq8\" (UID: \"4d563416-a6d9-452b-818e-6129a4343937\") " 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.224968 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics-certs\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.227901 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d563416-a6d9-452b-818e-6129a4343937-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dtjq8\" (UID: \"4d563416-a6d9-452b-818e-6129a4343937\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.228347 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f3a2fa-b24d-4bbc-ad36-9727e45e2e52-metrics-certs\") pod \"frr-k8s-nprl8\" (UID: \"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52\") " pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.326512 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metrics-certs\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.326585 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.326645 4585 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-metrics-certs\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:42 crc kubenswrapper[4585]: E0215 17:17:42.327200 4585 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 15 17:17:42 crc kubenswrapper[4585]: E0215 17:17:42.327248 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist podName:34c11c17-aea3-4d9f-8a9b-f84da1d1a1af nodeName:}" failed. No retries permitted until 2026-02-15 17:17:43.327233835 +0000 UTC m=+719.270641967 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist") pod "speaker-mhjlp" (UID: "34c11c17-aea3-4d9f-8a9b-f84da1d1a1af") : secret "metallb-memberlist" not found Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.330627 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-metrics-certs\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.331275 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b78ad86-3000-4ba4-b544-09b5890298e9-metrics-certs\") pod \"controller-69bbfbf88f-pk5r2\" (UID: \"8b78ad86-3000-4ba4-b544-09b5890298e9\") " pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.419060 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.424465 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.541955 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.656950 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8"] Feb 15 17:17:42 crc kubenswrapper[4585]: I0215 17:17:42.799932 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-pk5r2"] Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.343104 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.349053 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/34c11c17-aea3-4d9f-8a9b-f84da1d1a1af-memberlist\") pod \"speaker-mhjlp\" (UID: \"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af\") " pod="metallb-system/speaker-mhjlp" Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.401332 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-mhjlp" Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.403470 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-pk5r2" event={"ID":"8b78ad86-3000-4ba4-b544-09b5890298e9","Type":"ContainerStarted","Data":"5d20cd435852e8ddffab268ce9bf9da649878b95198f55db0ca7f33813b490ad"} Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.403512 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-pk5r2" event={"ID":"8b78ad86-3000-4ba4-b544-09b5890298e9","Type":"ContainerStarted","Data":"11e01fd684c18103e1108f341f518d89835ce2e3dde2f8a6dba0791c5a5852f0"} Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.403522 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-pk5r2" event={"ID":"8b78ad86-3000-4ba4-b544-09b5890298e9","Type":"ContainerStarted","Data":"033369757b4df4d885537b4b026ccd520d4e23c3e9bebe9434a93a9f21eca9c0"} Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.404494 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.405456 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerStarted","Data":"24032794db1fd45e6c519b8b2d66b762f304e84a6d96088235fb923335caa14e"} Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.406295 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" event={"ID":"4d563416-a6d9-452b-818e-6129a4343937","Type":"ContainerStarted","Data":"e83a9f79c755f3729c07477c46eaec487e39526c405085b0ba926b006d9cdea5"} Feb 15 17:17:43 crc kubenswrapper[4585]: W0215 17:17:43.430540 4585 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c11c17_aea3_4d9f_8a9b_f84da1d1a1af.slice/crio-cd3f7f87625c179bdfedd64c459761d39b18168ec9892cf792c85e416fa78054 WatchSource:0}: Error finding container cd3f7f87625c179bdfedd64c459761d39b18168ec9892cf792c85e416fa78054: Status 404 returned error can't find the container with id cd3f7f87625c179bdfedd64c459761d39b18168ec9892cf792c85e416fa78054 Feb 15 17:17:43 crc kubenswrapper[4585]: I0215 17:17:43.432534 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-pk5r2" podStartSLOduration=2.432518284 podStartE2EDuration="2.432518284s" podCreationTimestamp="2026-02-15 17:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:17:43.425114279 +0000 UTC m=+719.368522421" watchObservedRunningTime="2026-02-15 17:17:43.432518284 +0000 UTC m=+719.375926416" Feb 15 17:17:44 crc kubenswrapper[4585]: I0215 17:17:44.415909 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mhjlp" event={"ID":"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af","Type":"ContainerStarted","Data":"040d5fdfc00084e8d8e947473fcbd27347c62eaef89cc0ea772e521e816a05cd"} Feb 15 17:17:44 crc kubenswrapper[4585]: I0215 17:17:44.415952 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mhjlp" event={"ID":"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af","Type":"ContainerStarted","Data":"a5939c68eedad17a667ee76f3636acefa32038fc913445f80ab2ee5a4d615bd1"} Feb 15 17:17:44 crc kubenswrapper[4585]: I0215 17:17:44.415963 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-mhjlp" event={"ID":"34c11c17-aea3-4d9f-8a9b-f84da1d1a1af","Type":"ContainerStarted","Data":"cd3f7f87625c179bdfedd64c459761d39b18168ec9892cf792c85e416fa78054"} Feb 15 17:17:44 crc kubenswrapper[4585]: I0215 17:17:44.416188 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/speaker-mhjlp" Feb 15 17:17:44 crc kubenswrapper[4585]: I0215 17:17:44.464084 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-mhjlp" podStartSLOduration=3.46406578 podStartE2EDuration="3.46406578s" podCreationTimestamp="2026-02-15 17:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:17:44.451029629 +0000 UTC m=+720.394437761" watchObservedRunningTime="2026-02-15 17:17:44.46406578 +0000 UTC m=+720.407473912" Feb 15 17:17:47 crc kubenswrapper[4585]: I0215 17:17:47.013912 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:17:47 crc kubenswrapper[4585]: I0215 17:17:47.014212 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:17:50 crc kubenswrapper[4585]: I0215 17:17:50.466107 4585 generic.go:334] "Generic (PLEG): container finished" podID="89f3a2fa-b24d-4bbc-ad36-9727e45e2e52" containerID="24ff580ba1a4086753de9855e65e8db30cf85825933e236163a8e94e76367033" exitCode=0 Feb 15 17:17:50 crc kubenswrapper[4585]: I0215 17:17:50.466157 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerDied","Data":"24ff580ba1a4086753de9855e65e8db30cf85825933e236163a8e94e76367033"} Feb 15 17:17:50 crc kubenswrapper[4585]: I0215 17:17:50.467951 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" event={"ID":"4d563416-a6d9-452b-818e-6129a4343937","Type":"ContainerStarted","Data":"41b1f7cf726c319a3e571580a6fa1f8fedc3dad8bf3f55347f8ceeb8ea9110ab"} Feb 15 17:17:50 crc kubenswrapper[4585]: I0215 17:17:50.468042 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:17:50 crc kubenswrapper[4585]: I0215 17:17:50.521989 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" podStartSLOduration=2.214162326 podStartE2EDuration="9.521973718s" podCreationTimestamp="2026-02-15 17:17:41 +0000 UTC" firstStartedPulling="2026-02-15 17:17:42.663989204 +0000 UTC m=+718.607397336" lastFinishedPulling="2026-02-15 17:17:49.971800596 +0000 UTC m=+725.915208728" observedRunningTime="2026-02-15 17:17:50.521707041 +0000 UTC m=+726.465115173" watchObservedRunningTime="2026-02-15 17:17:50.521973718 +0000 UTC m=+726.465381850" Feb 15 17:17:51 crc kubenswrapper[4585]: I0215 17:17:51.479356 4585 generic.go:334] "Generic (PLEG): container finished" podID="89f3a2fa-b24d-4bbc-ad36-9727e45e2e52" containerID="1afbb4d03b4266539e0aff4f543f0a72135f139d520c79d9e4d3fb06f65f81ff" exitCode=0 Feb 15 17:17:51 crc kubenswrapper[4585]: I0215 17:17:51.479427 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerDied","Data":"1afbb4d03b4266539e0aff4f543f0a72135f139d520c79d9e4d3fb06f65f81ff"} Feb 15 17:17:52 crc kubenswrapper[4585]: I0215 17:17:52.489806 4585 generic.go:334] "Generic (PLEG): container finished" podID="89f3a2fa-b24d-4bbc-ad36-9727e45e2e52" containerID="4d1bde50eb47020a1d4c1ac0621ddf5ba9e45072631350e39a33a79113f2b983" exitCode=0 Feb 15 17:17:52 crc kubenswrapper[4585]: I0215 17:17:52.489921 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerDied","Data":"4d1bde50eb47020a1d4c1ac0621ddf5ba9e45072631350e39a33a79113f2b983"} Feb 15 17:17:52 crc kubenswrapper[4585]: I0215 17:17:52.550408 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-pk5r2" Feb 15 17:17:53 crc kubenswrapper[4585]: I0215 17:17:53.407794 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-mhjlp" Feb 15 17:17:53 crc kubenswrapper[4585]: I0215 17:17:53.505642 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerStarted","Data":"f43e347070ed019985a953cf06443a9e8c6e39f4508e08868cce41f4935a7f29"} Feb 15 17:17:53 crc kubenswrapper[4585]: I0215 17:17:53.505739 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerStarted","Data":"34f4dc0d1070dd4bb8ebaf17810912e0d7f496bd04d2adfeb2ed2ea31e712013"} Feb 15 17:17:53 crc kubenswrapper[4585]: I0215 17:17:53.505761 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerStarted","Data":"0257d5c065e44fa407231827483858f3a86d9e4daf75d9512e72bf7e17c50aa2"} Feb 15 17:17:53 crc kubenswrapper[4585]: I0215 17:17:53.505778 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerStarted","Data":"1a97c6a6b8ffaf63579d616365966abcce772d541fcbe808b47ad37d86bd84d0"} Feb 15 17:17:53 crc kubenswrapper[4585]: I0215 17:17:53.505794 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" 
event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerStarted","Data":"69a71ef9c956049bfaa0d89da802ba0ce29169b3943e5ee4c403a42e640bf89e"} Feb 15 17:17:54 crc kubenswrapper[4585]: I0215 17:17:54.520592 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nprl8" event={"ID":"89f3a2fa-b24d-4bbc-ad36-9727e45e2e52","Type":"ContainerStarted","Data":"fed05c1e4a1b0634b367da40d83c35e97f09279176f09229bde807f3f14e90e7"} Feb 15 17:17:54 crc kubenswrapper[4585]: I0215 17:17:54.520981 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:54 crc kubenswrapper[4585]: I0215 17:17:54.560155 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-nprl8" podStartSLOduration=6.150609863 podStartE2EDuration="13.560130283s" podCreationTimestamp="2026-02-15 17:17:41 +0000 UTC" firstStartedPulling="2026-02-15 17:17:42.586047115 +0000 UTC m=+718.529455247" lastFinishedPulling="2026-02-15 17:17:49.995567525 +0000 UTC m=+725.938975667" observedRunningTime="2026-02-15 17:17:54.547946046 +0000 UTC m=+730.491354208" watchObservedRunningTime="2026-02-15 17:17:54.560130283 +0000 UTC m=+730.503538445" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.366673 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-8z5rx"] Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.368570 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8z5rx" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.373993 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9qvsd" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.374250 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.374373 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.380382 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8z5rx"] Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.504530 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nggrf\" (UniqueName: \"kubernetes.io/projected/1be6dab7-a399-4e9b-91cc-c36bfbfef7ec-kube-api-access-nggrf\") pod \"openstack-operator-index-8z5rx\" (UID: \"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec\") " pod="openstack-operators/openstack-operator-index-8z5rx" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.606509 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nggrf\" (UniqueName: \"kubernetes.io/projected/1be6dab7-a399-4e9b-91cc-c36bfbfef7ec-kube-api-access-nggrf\") pod \"openstack-operator-index-8z5rx\" (UID: \"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec\") " pod="openstack-operators/openstack-operator-index-8z5rx" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.628403 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nggrf\" (UniqueName: \"kubernetes.io/projected/1be6dab7-a399-4e9b-91cc-c36bfbfef7ec-kube-api-access-nggrf\") pod \"openstack-operator-index-8z5rx\" (UID: 
\"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec\") " pod="openstack-operators/openstack-operator-index-8z5rx" Feb 15 17:17:56 crc kubenswrapper[4585]: I0215 17:17:56.717370 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-8z5rx" Feb 15 17:17:57 crc kubenswrapper[4585]: I0215 17:17:57.195281 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-8z5rx"] Feb 15 17:17:57 crc kubenswrapper[4585]: I0215 17:17:57.420478 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:57 crc kubenswrapper[4585]: I0215 17:17:57.468739 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-nprl8" Feb 15 17:17:57 crc kubenswrapper[4585]: I0215 17:17:57.547533 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8z5rx" event={"ID":"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec","Type":"ContainerStarted","Data":"b4881d055e31383bab644c3b3b9c9d95ce3f194d632efbdc2cdf68ed3fbe9765"} Feb 15 17:17:59 crc kubenswrapper[4585]: I0215 17:17:59.535448 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-8z5rx"] Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.150733 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-svdn7"] Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.151918 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.170943 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-svdn7"] Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.183052 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2f6\" (UniqueName: \"kubernetes.io/projected/725f7d01-0363-4872-9e03-494df9cdd50a-kube-api-access-9k2f6\") pod \"openstack-operator-index-svdn7\" (UID: \"725f7d01-0363-4872-9e03-494df9cdd50a\") " pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.283985 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k2f6\" (UniqueName: \"kubernetes.io/projected/725f7d01-0363-4872-9e03-494df9cdd50a-kube-api-access-9k2f6\") pod \"openstack-operator-index-svdn7\" (UID: \"725f7d01-0363-4872-9e03-494df9cdd50a\") " pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.320888 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2f6\" (UniqueName: \"kubernetes.io/projected/725f7d01-0363-4872-9e03-494df9cdd50a-kube-api-access-9k2f6\") pod \"openstack-operator-index-svdn7\" (UID: \"725f7d01-0363-4872-9e03-494df9cdd50a\") " pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.511974 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.591027 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8z5rx" event={"ID":"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec","Type":"ContainerStarted","Data":"858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd"} Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.591182 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-8z5rx" podUID="1be6dab7-a399-4e9b-91cc-c36bfbfef7ec" containerName="registry-server" containerID="cri-o://858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd" gracePeriod=2 Feb 15 17:18:00 crc kubenswrapper[4585]: I0215 17:18:00.645979 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-8z5rx" podStartSLOduration=1.7374804240000001 podStartE2EDuration="4.645956655s" podCreationTimestamp="2026-02-15 17:17:56 +0000 UTC" firstStartedPulling="2026-02-15 17:17:57.184511345 +0000 UTC m=+733.127919487" lastFinishedPulling="2026-02-15 17:18:00.092987586 +0000 UTC m=+736.036395718" observedRunningTime="2026-02-15 17:18:00.637020106 +0000 UTC m=+736.580428238" watchObservedRunningTime="2026-02-15 17:18:00.645956655 +0000 UTC m=+736.589364787" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.015272 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8z5rx" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.090279 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-svdn7"] Feb 15 17:18:01 crc kubenswrapper[4585]: W0215 17:18:01.105710 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod725f7d01_0363_4872_9e03_494df9cdd50a.slice/crio-9b6b45754a0e2f2a8982c2b19369b19c4efa5c84388b4d5af30527484cb6a224 WatchSource:0}: Error finding container 9b6b45754a0e2f2a8982c2b19369b19c4efa5c84388b4d5af30527484cb6a224: Status 404 returned error can't find the container with id 9b6b45754a0e2f2a8982c2b19369b19c4efa5c84388b4d5af30527484cb6a224 Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.196575 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nggrf\" (UniqueName: \"kubernetes.io/projected/1be6dab7-a399-4e9b-91cc-c36bfbfef7ec-kube-api-access-nggrf\") pod \"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec\" (UID: \"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec\") " Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.202709 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be6dab7-a399-4e9b-91cc-c36bfbfef7ec-kube-api-access-nggrf" (OuterVolumeSpecName: "kube-api-access-nggrf") pod "1be6dab7-a399-4e9b-91cc-c36bfbfef7ec" (UID: "1be6dab7-a399-4e9b-91cc-c36bfbfef7ec"). InnerVolumeSpecName "kube-api-access-nggrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.298766 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nggrf\" (UniqueName: \"kubernetes.io/projected/1be6dab7-a399-4e9b-91cc-c36bfbfef7ec-kube-api-access-nggrf\") on node \"crc\" DevicePath \"\"" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.604936 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-svdn7" event={"ID":"725f7d01-0363-4872-9e03-494df9cdd50a","Type":"ContainerStarted","Data":"9b6b45754a0e2f2a8982c2b19369b19c4efa5c84388b4d5af30527484cb6a224"} Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.609010 4585 generic.go:334] "Generic (PLEG): container finished" podID="1be6dab7-a399-4e9b-91cc-c36bfbfef7ec" containerID="858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd" exitCode=0 Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.609067 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8z5rx" event={"ID":"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec","Type":"ContainerDied","Data":"858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd"} Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.609106 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-8z5rx" event={"ID":"1be6dab7-a399-4e9b-91cc-c36bfbfef7ec","Type":"ContainerDied","Data":"b4881d055e31383bab644c3b3b9c9d95ce3f194d632efbdc2cdf68ed3fbe9765"} Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.609136 4585 scope.go:117] "RemoveContainer" containerID="858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.609169 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-8z5rx" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.639867 4585 scope.go:117] "RemoveContainer" containerID="858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd" Feb 15 17:18:01 crc kubenswrapper[4585]: E0215 17:18:01.640477 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd\": container with ID starting with 858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd not found: ID does not exist" containerID="858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.640544 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd"} err="failed to get container status \"858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd\": rpc error: code = NotFound desc = could not find container \"858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd\": container with ID starting with 858e0f5fb1575b08e0b359633b0d6b0e325e1cb6a4623a33531383f903908bdd not found: ID does not exist" Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.662531 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-8z5rx"] Feb 15 17:18:01 crc kubenswrapper[4585]: I0215 17:18:01.672924 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-8z5rx"] Feb 15 17:18:02 crc kubenswrapper[4585]: I0215 17:18:02.423693 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-nprl8" Feb 15 17:18:02 crc kubenswrapper[4585]: I0215 17:18:02.429636 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dtjq8" Feb 15 17:18:02 crc kubenswrapper[4585]: I0215 17:18:02.618567 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-svdn7" event={"ID":"725f7d01-0363-4872-9e03-494df9cdd50a","Type":"ContainerStarted","Data":"dd5f6e00e7d5700c588edfd130e501cb295b0e59ededa812b887d960b2291fbb"} Feb 15 17:18:02 crc kubenswrapper[4585]: I0215 17:18:02.635946 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-svdn7" podStartSLOduration=1.6441429159999998 podStartE2EDuration="2.635931021s" podCreationTimestamp="2026-02-15 17:18:00 +0000 UTC" firstStartedPulling="2026-02-15 17:18:01.110363519 +0000 UTC m=+737.053771651" lastFinishedPulling="2026-02-15 17:18:02.102151614 +0000 UTC m=+738.045559756" observedRunningTime="2026-02-15 17:18:02.631988342 +0000 UTC m=+738.575396474" watchObservedRunningTime="2026-02-15 17:18:02.635931021 +0000 UTC m=+738.579339153" Feb 15 17:18:02 crc kubenswrapper[4585]: I0215 17:18:02.862016 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be6dab7-a399-4e9b-91cc-c36bfbfef7ec" path="/var/lib/kubelet/pods/1be6dab7-a399-4e9b-91cc-c36bfbfef7ec/volumes" Feb 15 17:18:10 crc kubenswrapper[4585]: I0215 17:18:10.513147 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:10 crc kubenswrapper[4585]: I0215 17:18:10.514111 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:10 crc kubenswrapper[4585]: I0215 17:18:10.572127 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:10 crc kubenswrapper[4585]: I0215 17:18:10.721306 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-svdn7" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.221815 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq"] Feb 15 17:18:12 crc kubenswrapper[4585]: E0215 17:18:12.222513 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be6dab7-a399-4e9b-91cc-c36bfbfef7ec" containerName="registry-server" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.222530 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be6dab7-a399-4e9b-91cc-c36bfbfef7ec" containerName="registry-server" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.222787 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be6dab7-a399-4e9b-91cc-c36bfbfef7ec" containerName="registry-server" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.224217 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.227226 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4kfrm" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.242965 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq"] Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.415522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4kh5\" (UniqueName: \"kubernetes.io/projected/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-kube-api-access-f4kh5\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 
crc kubenswrapper[4585]: I0215 17:18:12.415636 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-bundle\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.415691 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-util\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.517711 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4kh5\" (UniqueName: \"kubernetes.io/projected/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-kube-api-access-f4kh5\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.517767 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-bundle\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.517816 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-util\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.518332 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-util\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.518587 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-bundle\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.550583 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4kh5\" (UniqueName: \"kubernetes.io/projected/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-kube-api-access-f4kh5\") pod \"3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") " pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:12 crc kubenswrapper[4585]: I0215 17:18:12.604353 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" Feb 15 17:18:13 crc kubenswrapper[4585]: I0215 17:18:13.042955 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq"] Feb 15 17:18:13 crc kubenswrapper[4585]: W0215 17:18:13.048010 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9fb975a_75bb_42c7_862e_50a72ccb6c1e.slice/crio-719f8d1baed9a973db2177f4a0c525bc4f040e3efa606632c7fb2cabe28232f4 WatchSource:0}: Error finding container 719f8d1baed9a973db2177f4a0c525bc4f040e3efa606632c7fb2cabe28232f4: Status 404 returned error can't find the container with id 719f8d1baed9a973db2177f4a0c525bc4f040e3efa606632c7fb2cabe28232f4 Feb 15 17:18:13 crc kubenswrapper[4585]: I0215 17:18:13.733169 4585 generic.go:334] "Generic (PLEG): container finished" podID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerID="9bfa8b4b8c36f00a75220eac06a560d17e2bf8678b98d2923ba518a57e67954d" exitCode=0 Feb 15 17:18:13 crc kubenswrapper[4585]: I0215 17:18:13.733288 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" event={"ID":"b9fb975a-75bb-42c7-862e-50a72ccb6c1e","Type":"ContainerDied","Data":"9bfa8b4b8c36f00a75220eac06a560d17e2bf8678b98d2923ba518a57e67954d"} Feb 15 17:18:13 crc kubenswrapper[4585]: I0215 17:18:13.733578 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" event={"ID":"b9fb975a-75bb-42c7-862e-50a72ccb6c1e","Type":"ContainerStarted","Data":"719f8d1baed9a973db2177f4a0c525bc4f040e3efa606632c7fb2cabe28232f4"} Feb 15 17:18:14 crc kubenswrapper[4585]: I0215 17:18:14.745483 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerID="75afc8f45107cef40e1b59d62d17241d2a802890e19078aab474f975495946d9" exitCode=0 Feb 15 17:18:14 crc kubenswrapper[4585]: I0215 17:18:14.745584 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" event={"ID":"b9fb975a-75bb-42c7-862e-50a72ccb6c1e","Type":"ContainerDied","Data":"75afc8f45107cef40e1b59d62d17241d2a802890e19078aab474f975495946d9"} Feb 15 17:18:16 crc kubenswrapper[4585]: E0215 17:18:16.727842 4585 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.887s" Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.014591 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.014669 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.014713 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.015345 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66bd3998ff2493b6c4431c56b818df1c025a69a1e07091de641e0ebe4853beee"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.015402 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://66bd3998ff2493b6c4431c56b818df1c025a69a1e07091de641e0ebe4853beee" gracePeriod=600
Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.749157 4585 generic.go:334] "Generic (PLEG): container finished" podID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerID="a9e9d82e9dfc009d152c038656ac8298ca475558b312bffa1427788a9a2eab2f" exitCode=0
Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.749216 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" event={"ID":"b9fb975a-75bb-42c7-862e-50a72ccb6c1e","Type":"ContainerDied","Data":"a9e9d82e9dfc009d152c038656ac8298ca475558b312bffa1427788a9a2eab2f"}
Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.751808 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="66bd3998ff2493b6c4431c56b818df1c025a69a1e07091de641e0ebe4853beee" exitCode=0
Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.751858 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"66bd3998ff2493b6c4431c56b818df1c025a69a1e07091de641e0ebe4853beee"}
Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.751883 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"0cc3960491aa7365ef9a992dbe57461170d7a99f094dd61fc5fce5575354ba90"}
Feb 15 17:18:17 crc kubenswrapper[4585]: I0215 17:18:17.751906 4585 scope.go:117] "RemoveContainer" containerID="3be178554d50f4bf9fd8b4ef83fa0f64425cad682ded41b72f52fbeaa156e13e"
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.132509 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq"
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.309587 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4kh5\" (UniqueName: \"kubernetes.io/projected/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-kube-api-access-f4kh5\") pod \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") "
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.309703 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-util\") pod \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") "
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.309739 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-bundle\") pod \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\" (UID: \"b9fb975a-75bb-42c7-862e-50a72ccb6c1e\") "
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.311556 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-bundle" (OuterVolumeSpecName: "bundle") pod "b9fb975a-75bb-42c7-862e-50a72ccb6c1e" (UID: "b9fb975a-75bb-42c7-862e-50a72ccb6c1e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.312207 4585 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-bundle\") on node \"crc\" DevicePath \"\""
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.324274 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-util" (OuterVolumeSpecName: "util") pod "b9fb975a-75bb-42c7-862e-50a72ccb6c1e" (UID: "b9fb975a-75bb-42c7-862e-50a72ccb6c1e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.333709 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-kube-api-access-f4kh5" (OuterVolumeSpecName: "kube-api-access-f4kh5") pod "b9fb975a-75bb-42c7-862e-50a72ccb6c1e" (UID: "b9fb975a-75bb-42c7-862e-50a72ccb6c1e"). InnerVolumeSpecName "kube-api-access-f4kh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.413629 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4kh5\" (UniqueName: \"kubernetes.io/projected/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-kube-api-access-f4kh5\") on node \"crc\" DevicePath \"\""
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.413923 4585 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fb975a-75bb-42c7-862e-50a72ccb6c1e-util\") on node \"crc\" DevicePath \"\""
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.786393 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq" event={"ID":"b9fb975a-75bb-42c7-862e-50a72ccb6c1e","Type":"ContainerDied","Data":"719f8d1baed9a973db2177f4a0c525bc4f040e3efa606632c7fb2cabe28232f4"}
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.786460 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq"
Feb 15 17:18:19 crc kubenswrapper[4585]: I0215 17:18:19.786458 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="719f8d1baed9a973db2177f4a0c525bc4f040e3efa606632c7fb2cabe28232f4"
Feb 15 17:18:23 crc kubenswrapper[4585]: I0215 17:18:23.040344 4585 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.235034 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"]
Feb 15 17:18:24 crc kubenswrapper[4585]: E0215 17:18:24.235765 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerName="pull"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.235791 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerName="pull"
Feb 15 17:18:24 crc kubenswrapper[4585]: E0215 17:18:24.235846 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerName="util"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.235859 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerName="util"
Feb 15 17:18:24 crc kubenswrapper[4585]: E0215 17:18:24.235888 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerName="extract"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.235906 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerName="extract"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.237626 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9fb975a-75bb-42c7-862e-50a72ccb6c1e" containerName="extract"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.238559 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.248111 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-clw2m"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.263045 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"]
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.393592 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9bg\" (UniqueName: \"kubernetes.io/projected/e70b72bf-8467-4e87-b021-be653a6d218e-kube-api-access-dr9bg\") pod \"openstack-operator-controller-init-567dc79d78-vrx9l\" (UID: \"e70b72bf-8467-4e87-b021-be653a6d218e\") " pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.494958 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr9bg\" (UniqueName: \"kubernetes.io/projected/e70b72bf-8467-4e87-b021-be653a6d218e-kube-api-access-dr9bg\") pod \"openstack-operator-controller-init-567dc79d78-vrx9l\" (UID: \"e70b72bf-8467-4e87-b021-be653a6d218e\") " pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.514624 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr9bg\" (UniqueName: \"kubernetes.io/projected/e70b72bf-8467-4e87-b021-be653a6d218e-kube-api-access-dr9bg\") pod \"openstack-operator-controller-init-567dc79d78-vrx9l\" (UID: \"e70b72bf-8467-4e87-b021-be653a6d218e\") " pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.572696 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"
Feb 15 17:18:24 crc kubenswrapper[4585]: I0215 17:18:24.919336 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"]
Feb 15 17:18:25 crc kubenswrapper[4585]: I0215 17:18:25.850649 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l" event={"ID":"e70b72bf-8467-4e87-b021-be653a6d218e","Type":"ContainerStarted","Data":"9a3cdfeca0a076afad34639ca0b5ef7a9dec5831d200924e1d00c1c5bd2189c5"}
Feb 15 17:18:29 crc kubenswrapper[4585]: I0215 17:18:29.878831 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l" event={"ID":"e70b72bf-8467-4e87-b021-be653a6d218e","Type":"ContainerStarted","Data":"1203f60971b1ea058caca81d336f43e919397929c423aa7a730979234986f890"}
Feb 15 17:18:29 crc kubenswrapper[4585]: I0215 17:18:29.879517 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"
Feb 15 17:18:29 crc kubenswrapper[4585]: I0215 17:18:29.918593 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l" podStartSLOduration=1.375179661 podStartE2EDuration="5.918565482s" podCreationTimestamp="2026-02-15 17:18:24 +0000 UTC" firstStartedPulling="2026-02-15 17:18:24.930956035 +0000 UTC m=+760.874364157" lastFinishedPulling="2026-02-15 17:18:29.474341846 +0000 UTC m=+765.417749978" observedRunningTime="2026-02-15 17:18:29.911101305 +0000 UTC m=+765.854509437" watchObservedRunningTime="2026-02-15 17:18:29.918565482 +0000 UTC m=+765.861973654"
Feb 15 17:18:34 crc kubenswrapper[4585]: I0215 17:18:34.587905 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-567dc79d78-vrx9l"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.009319 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.010878 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.012824 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jlxqr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.016855 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.018219 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.032003 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-pmcgb"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.035726 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.036723 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.040144 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-b94bs"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.062970 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.097797 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.099004 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.120405 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.124955 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-p9jrr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.136651 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.148040 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.149691 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.159588 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-csxwx"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.160455 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bf5j\" (UniqueName: \"kubernetes.io/projected/c1b6598f-2367-4118-9e8c-90018190d1fb-kube-api-access-7bf5j\") pod \"designate-operator-controller-manager-6d8bf5c495-vs4dd\" (UID: \"c1b6598f-2367-4118-9e8c-90018190d1fb\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.160497 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khk84\" (UniqueName: \"kubernetes.io/projected/345af412-80a4-4d2b-9738-9ed005847c6a-kube-api-access-khk84\") pod \"cinder-operator-controller-manager-5d946d989d-p75lp\" (UID: \"345af412-80a4-4d2b-9738-9ed005847c6a\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.160546 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhplk\" (UniqueName: \"kubernetes.io/projected/a0a88360-7506-4420-a652-8abb63a4f2ea-kube-api-access-qhplk\") pod \"barbican-operator-controller-manager-868647ff47-rr6q7\" (UID: \"a0a88360-7506-4420-a652-8abb63a4f2ea\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.165047 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.192246 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.209388 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.210380 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.221087 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wt4tk"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.228641 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.229640 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.242935 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.251658 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.252281 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-x45vx"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.252519 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.262343 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khk84\" (UniqueName: \"kubernetes.io/projected/345af412-80a4-4d2b-9738-9ed005847c6a-kube-api-access-khk84\") pod \"cinder-operator-controller-manager-5d946d989d-p75lp\" (UID: \"345af412-80a4-4d2b-9738-9ed005847c6a\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.262410 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzv5l\" (UniqueName: \"kubernetes.io/projected/72febc8a-8640-43c3-a6d2-5ca8156d827a-kube-api-access-dzv5l\") pod \"glance-operator-controller-manager-77987464f4-j9lfr\" (UID: \"72febc8a-8640-43c3-a6d2-5ca8156d827a\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.262446 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhplk\" (UniqueName: \"kubernetes.io/projected/a0a88360-7506-4420-a652-8abb63a4f2ea-kube-api-access-qhplk\") pod \"barbican-operator-controller-manager-868647ff47-rr6q7\" (UID: \"a0a88360-7506-4420-a652-8abb63a4f2ea\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.262490 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dt86\" (UniqueName: \"kubernetes.io/projected/91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2-kube-api-access-7dt86\") pod \"heat-operator-controller-manager-69f49c598c-zf88x\" (UID: \"91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.262542 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bf5j\" (UniqueName: \"kubernetes.io/projected/c1b6598f-2367-4118-9e8c-90018190d1fb-kube-api-access-7bf5j\") pod \"designate-operator-controller-manager-6d8bf5c495-vs4dd\" (UID: \"c1b6598f-2367-4118-9e8c-90018190d1fb\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.283670 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.284848 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.288395 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-t8mf4"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.306680 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.308403 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.320686 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.331443 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-b5574"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.340951 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.343985 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bf5j\" (UniqueName: \"kubernetes.io/projected/c1b6598f-2367-4118-9e8c-90018190d1fb-kube-api-access-7bf5j\") pod \"designate-operator-controller-manager-6d8bf5c495-vs4dd\" (UID: \"c1b6598f-2367-4118-9e8c-90018190d1fb\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.359365 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhplk\" (UniqueName: \"kubernetes.io/projected/a0a88360-7506-4420-a652-8abb63a4f2ea-kube-api-access-qhplk\") pod \"barbican-operator-controller-manager-868647ff47-rr6q7\" (UID: \"a0a88360-7506-4420-a652-8abb63a4f2ea\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.360224 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khk84\" (UniqueName: \"kubernetes.io/projected/345af412-80a4-4d2b-9738-9ed005847c6a-kube-api-access-khk84\") pod \"cinder-operator-controller-manager-5d946d989d-p75lp\" (UID: \"345af412-80a4-4d2b-9738-9ed005847c6a\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.406356 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dt86\" (UniqueName: \"kubernetes.io/projected/91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2-kube-api-access-7dt86\") pod \"heat-operator-controller-manager-69f49c598c-zf88x\" (UID: \"91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.406572 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krmx8\" (UniqueName: \"kubernetes.io/projected/64500064-0cb9-4c1e-9370-960a7aa9617c-kube-api-access-krmx8\") pod \"horizon-operator-controller-manager-5b9b8895d5-p8qvp\" (UID: \"64500064-0cb9-4c1e-9370-960a7aa9617c\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.421216 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.468646 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrxs\" (UniqueName: \"kubernetes.io/projected/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-kube-api-access-ncrxs\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.468743 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.468850 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzv5l\" (UniqueName: \"kubernetes.io/projected/72febc8a-8640-43c3-a6d2-5ca8156d827a-kube-api-access-dzv5l\") pod \"glance-operator-controller-manager-77987464f4-j9lfr\" (UID: \"72febc8a-8640-43c3-a6d2-5ca8156d827a\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.486744 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.488229 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.508326 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dt86\" (UniqueName: \"kubernetes.io/projected/91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2-kube-api-access-7dt86\") pod \"heat-operator-controller-manager-69f49c598c-zf88x\" (UID: \"91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.511020 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-qdmmt"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.522293 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzv5l\" (UniqueName: \"kubernetes.io/projected/72febc8a-8640-43c3-a6d2-5ca8156d827a-kube-api-access-dzv5l\") pod \"glance-operator-controller-manager-77987464f4-j9lfr\" (UID: \"72febc8a-8640-43c3-a6d2-5ca8156d827a\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.535177 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.536439 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.539892 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-q97tc"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.557944 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.563384 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.569841 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzwjt\" (UniqueName: \"kubernetes.io/projected/cb189c3a-a98e-4fc4-b074-ce2e17b2950b-kube-api-access-bzwjt\") pod \"keystone-operator-controller-manager-b4d948c87-btbfc\" (UID: \"cb189c3a-a98e-4fc4-b074-ce2e17b2950b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.569879 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqzkf\" (UniqueName: \"kubernetes.io/projected/3d704152-30bf-4588-ba21-bc5f23265fb6-kube-api-access-bqzkf\") pod \"manila-operator-controller-manager-54f6768c69-5cqqm\" (UID: \"3d704152-30bf-4588-ba21-bc5f23265fb6\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.569939 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krmx8\" (UniqueName: \"kubernetes.io/projected/64500064-0cb9-4c1e-9370-960a7aa9617c-kube-api-access-krmx8\") pod \"horizon-operator-controller-manager-5b9b8895d5-p8qvp\" (UID: \"64500064-0cb9-4c1e-9370-960a7aa9617c\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.569964 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrxs\" (UniqueName: \"kubernetes.io/projected/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-kube-api-access-ncrxs\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.569983 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvrg\" (UniqueName: \"kubernetes.io/projected/d32edc1b-dcb5-4338-801c-fb1657a78892-kube-api-access-2jvrg\") pod \"ironic-operator-controller-manager-554564d7fc-chqj4\" (UID: \"d32edc1b-dcb5-4338-801c-fb1657a78892\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.570009 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"
Feb 15 17:18:53 crc kubenswrapper[4585]: E0215 17:18:53.570142 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 15 17:18:53 crc kubenswrapper[4585]: E0215 17:18:53.570196 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert podName:9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414 nodeName:}" failed. No retries permitted until 2026-02-15 17:18:54.070176433 +0000 UTC m=+790.013584565 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert") pod "infra-operator-controller-manager-7c4bfc5b96-482zr" (UID: "9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414") : secret "infra-operator-webhook-server-cert" not found
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.581571 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.583369 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.591836 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.598535 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xttlg"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.624806 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrxs\" (UniqueName: \"kubernetes.io/projected/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-kube-api-access-ncrxs\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.633999 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.634934 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krmx8\" (UniqueName: \"kubernetes.io/projected/64500064-0cb9-4c1e-9370-960a7aa9617c-kube-api-access-krmx8\") pod \"horizon-operator-controller-manager-5b9b8895d5-p8qvp\" (UID: \"64500064-0cb9-4c1e-9370-960a7aa9617c\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.653241 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.654927 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.658182 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.678048 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hdzjr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.681698 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvrg\" (UniqueName: \"kubernetes.io/projected/d32edc1b-dcb5-4338-801c-fb1657a78892-kube-api-access-2jvrg\") pod \"ironic-operator-controller-manager-554564d7fc-chqj4\" (UID: \"d32edc1b-dcb5-4338-801c-fb1657a78892\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.681823 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bkwg\" (UniqueName: \"kubernetes.io/projected/1bf81164-cb02-4381-a1e5-b28b7648f613-kube-api-access-7bkwg\") pod \"mariadb-operator-controller-manager-6994f66f48-f2wrc\" (UID: \"1bf81164-cb02-4381-a1e5-b28b7648f613\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.681864 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqp7\" (UniqueName: \"kubernetes.io/projected/abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b-kube-api-access-mfqp7\") pod \"neutron-operator-controller-manager-64ddbf8bb-ndnrr\" (UID: \"abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.681888 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzwjt\" (UniqueName: \"kubernetes.io/projected/cb189c3a-a98e-4fc4-b074-ce2e17b2950b-kube-api-access-bzwjt\") pod \"keystone-operator-controller-manager-b4d948c87-btbfc\" (UID: \"cb189c3a-a98e-4fc4-b074-ce2e17b2950b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.681914 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqzkf\" (UniqueName: \"kubernetes.io/projected/3d704152-30bf-4588-ba21-bc5f23265fb6-kube-api-access-bqzkf\") pod \"manila-operator-controller-manager-54f6768c69-5cqqm\" (UID: \"3d704152-30bf-4588-ba21-bc5f23265fb6\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.702675 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq"]
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.703879 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.710023 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bqpfx"
Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.723676 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.729441 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzwjt\" (UniqueName: \"kubernetes.io/projected/cb189c3a-a98e-4fc4-b074-ce2e17b2950b-kube-api-access-bzwjt\") pod \"keystone-operator-controller-manager-b4d948c87-btbfc\" (UID: \"cb189c3a-a98e-4fc4-b074-ce2e17b2950b\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.730196 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvrg\" (UniqueName: \"kubernetes.io/projected/d32edc1b-dcb5-4338-801c-fb1657a78892-kube-api-access-2jvrg\") pod \"ironic-operator-controller-manager-554564d7fc-chqj4\" (UID: \"d32edc1b-dcb5-4338-801c-fb1657a78892\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.748167 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5"] Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.765223 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqzkf\" (UniqueName: \"kubernetes.io/projected/3d704152-30bf-4588-ba21-bc5f23265fb6-kube-api-access-bqzkf\") pod \"manila-operator-controller-manager-54f6768c69-5cqqm\" (UID: \"3d704152-30bf-4588-ba21-bc5f23265fb6\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.783174 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bkwg\" (UniqueName: \"kubernetes.io/projected/1bf81164-cb02-4381-a1e5-b28b7648f613-kube-api-access-7bkwg\") pod \"mariadb-operator-controller-manager-6994f66f48-f2wrc\" (UID: 
\"1bf81164-cb02-4381-a1e5-b28b7648f613\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.783221 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqp7\" (UniqueName: \"kubernetes.io/projected/abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b-kube-api-access-mfqp7\") pod \"neutron-operator-controller-manager-64ddbf8bb-ndnrr\" (UID: \"abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.783248 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5rvc\" (UniqueName: \"kubernetes.io/projected/58be13ef-eba2-4fdd-9fff-2b96d1b38143-kube-api-access-d5rvc\") pod \"octavia-operator-controller-manager-69f8888797-fvsm5\" (UID: \"58be13ef-eba2-4fdd-9fff-2b96d1b38143\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.783269 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghrb\" (UniqueName: \"kubernetes.io/projected/afc8afe1-32b1-47ef-9fd4-6331fec926f5-kube-api-access-bghrb\") pod \"nova-operator-controller-manager-567668f5cf-ztxlq\" (UID: \"afc8afe1-32b1-47ef-9fd4-6331fec926f5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.789718 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.813136 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bkwg\" (UniqueName: \"kubernetes.io/projected/1bf81164-cb02-4381-a1e5-b28b7648f613-kube-api-access-7bkwg\") pod \"mariadb-operator-controller-manager-6994f66f48-f2wrc\" (UID: \"1bf81164-cb02-4381-a1e5-b28b7648f613\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.823160 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqp7\" (UniqueName: \"kubernetes.io/projected/abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b-kube-api-access-mfqp7\") pod \"neutron-operator-controller-manager-64ddbf8bb-ndnrr\" (UID: \"abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.857941 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.865676 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq"] Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.885387 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5rvc\" (UniqueName: \"kubernetes.io/projected/58be13ef-eba2-4fdd-9fff-2b96d1b38143-kube-api-access-d5rvc\") pod \"octavia-operator-controller-manager-69f8888797-fvsm5\" (UID: \"58be13ef-eba2-4fdd-9fff-2b96d1b38143\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.885434 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghrb\" (UniqueName: \"kubernetes.io/projected/afc8afe1-32b1-47ef-9fd4-6331fec926f5-kube-api-access-bghrb\") pod \"nova-operator-controller-manager-567668f5cf-ztxlq\" (UID: \"afc8afe1-32b1-47ef-9fd4-6331fec926f5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.907507 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt"] Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.908989 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.910025 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.911207 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.916863 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.917265 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-6x2s7" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.936134 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.941311 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5rvc\" (UniqueName: \"kubernetes.io/projected/58be13ef-eba2-4fdd-9fff-2b96d1b38143-kube-api-access-d5rvc\") pod \"octavia-operator-controller-manager-69f8888797-fvsm5\" (UID: \"58be13ef-eba2-4fdd-9fff-2b96d1b38143\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.941394 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv"] Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.943453 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghrb\" (UniqueName: \"kubernetes.io/projected/afc8afe1-32b1-47ef-9fd4-6331fec926f5-kube-api-access-bghrb\") pod \"nova-operator-controller-manager-567668f5cf-ztxlq\" (UID: \"afc8afe1-32b1-47ef-9fd4-6331fec926f5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.952879 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.955420 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zn5vs" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.967786 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv"] Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.981566 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.985568 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt"] Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.986798 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:53 crc kubenswrapper[4585]: I0215 17:18:53.986863 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tjt\" (UniqueName: \"kubernetes.io/projected/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-kube-api-access-67tjt\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.000506 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.001924 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.005819 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-b6v9s" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.019576 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.032433 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.044166 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.053139 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5dh24"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.054541 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.061049 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.063387 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gcbxr" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.065549 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5dh24"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.072326 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.073526 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.091227 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4p9t5" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.091399 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj98w\" (UniqueName: \"kubernetes.io/projected/e447b2b4-5bfe-4481-a36a-241124fd507a-kube-api-access-fj98w\") pod \"placement-operator-controller-manager-8497b45c89-7s6mk\" (UID: \"e447b2b4-5bfe-4481-a36a-241124fd507a\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.091496 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.091542 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-896zb\" (UniqueName: \"kubernetes.io/projected/7796cdbe-ab03-4bac-b2cc-e828e42f438f-kube-api-access-896zb\") pod \"ovn-operator-controller-manager-d44cf6b75-8tzvv\" (UID: \"7796cdbe-ab03-4bac-b2cc-e828e42f438f\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.091568 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67tjt\" (UniqueName: \"kubernetes.io/projected/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-kube-api-access-67tjt\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.091608 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.091739 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.091792 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert podName:9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414 nodeName:}" failed. 
No retries permitted until 2026-02-15 17:18:55.091777722 +0000 UTC m=+791.035185854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert") pod "infra-operator-controller-manager-7c4bfc5b96-482zr" (UID: "9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414") : secret "infra-operator-webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.092053 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.092115 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert podName:4378a5c2-4e4a-422a-9cd5-b55433ac3fbe nodeName:}" failed. No retries permitted until 2026-02-15 17:18:54.592090691 +0000 UTC m=+790.535498823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" (UID: "4378a5c2-4e4a-422a-9cd5-b55433ac3fbe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.104661 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.111537 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2l54v"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.112768 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.115556 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ln57p" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.145066 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67tjt\" (UniqueName: \"kubernetes.io/projected/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-kube-api-access-67tjt\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.199071 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4pl\" (UniqueName: \"kubernetes.io/projected/cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8-kube-api-access-8k4pl\") pod \"test-operator-controller-manager-7866795846-2l54v\" (UID: \"cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.199125 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj98w\" (UniqueName: \"kubernetes.io/projected/e447b2b4-5bfe-4481-a36a-241124fd507a-kube-api-access-fj98w\") pod \"placement-operator-controller-manager-8497b45c89-7s6mk\" (UID: \"e447b2b4-5bfe-4481-a36a-241124fd507a\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.199162 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt9cx\" (UniqueName: 
\"kubernetes.io/projected/8a5c9ab0-1600-4130-8293-6672efc2188d-kube-api-access-qt9cx\") pod \"telemetry-operator-controller-manager-7f45b4ff68-rh8kk\" (UID: \"8a5c9ab0-1600-4130-8293-6672efc2188d\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.199268 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-896zb\" (UniqueName: \"kubernetes.io/projected/7796cdbe-ab03-4bac-b2cc-e828e42f438f-kube-api-access-896zb\") pod \"ovn-operator-controller-manager-d44cf6b75-8tzvv\" (UID: \"7796cdbe-ab03-4bac-b2cc-e828e42f438f\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.199295 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwqm\" (UniqueName: \"kubernetes.io/projected/58ebd0ec-b27f-493b-accb-7c43c2408f19-kube-api-access-9xwqm\") pod \"swift-operator-controller-manager-68f46476f-5dh24\" (UID: \"58ebd0ec-b27f-493b-accb-7c43c2408f19\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.220522 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj98w\" (UniqueName: \"kubernetes.io/projected/e447b2b4-5bfe-4481-a36a-241124fd507a-kube-api-access-fj98w\") pod \"placement-operator-controller-manager-8497b45c89-7s6mk\" (UID: \"e447b2b4-5bfe-4481-a36a-241124fd507a\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.250669 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2l54v"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.267582 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.272525 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.275103 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-896zb\" (UniqueName: \"kubernetes.io/projected/7796cdbe-ab03-4bac-b2cc-e828e42f438f-kube-api-access-896zb\") pod \"ovn-operator-controller-manager-d44cf6b75-8tzvv\" (UID: \"7796cdbe-ab03-4bac-b2cc-e828e42f438f\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.281435 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-g4gbr" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.328916 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4pl\" (UniqueName: \"kubernetes.io/projected/cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8-kube-api-access-8k4pl\") pod \"test-operator-controller-manager-7866795846-2l54v\" (UID: \"cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.329189 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt9cx\" (UniqueName: \"kubernetes.io/projected/8a5c9ab0-1600-4130-8293-6672efc2188d-kube-api-access-qt9cx\") pod \"telemetry-operator-controller-manager-7f45b4ff68-rh8kk\" (UID: \"8a5c9ab0-1600-4130-8293-6672efc2188d\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.331910 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.339296 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwqm\" (UniqueName: \"kubernetes.io/projected/58ebd0ec-b27f-493b-accb-7c43c2408f19-kube-api-access-9xwqm\") pod \"swift-operator-controller-manager-68f46476f-5dh24\" (UID: \"58ebd0ec-b27f-493b-accb-7c43c2408f19\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.355256 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4pl\" (UniqueName: \"kubernetes.io/projected/cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8-kube-api-access-8k4pl\") pod \"test-operator-controller-manager-7866795846-2l54v\" (UID: \"cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8\") " pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.363304 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xwqm\" (UniqueName: \"kubernetes.io/projected/58ebd0ec-b27f-493b-accb-7c43c2408f19-kube-api-access-9xwqm\") pod \"swift-operator-controller-manager-68f46476f-5dh24\" (UID: \"58ebd0ec-b27f-493b-accb-7c43c2408f19\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.364523 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.366193 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt9cx\" (UniqueName: \"kubernetes.io/projected/8a5c9ab0-1600-4130-8293-6672efc2188d-kube-api-access-qt9cx\") pod \"telemetry-operator-controller-manager-7f45b4ff68-rh8kk\" (UID: \"8a5c9ab0-1600-4130-8293-6672efc2188d\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.371261 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.420998 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.422514 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.425617 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.426475 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6hjzh" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.427389 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.427706 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.435009 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.449558 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq788\" (UniqueName: \"kubernetes.io/projected/82f1699a-e706-412e-af47-89b0ed090f92-kube-api-access-lq788\") pod \"watcher-operator-controller-manager-5db88f68c-vvfcn\" (UID: \"82f1699a-e706-412e-af47-89b0ed090f92\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.452549 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.467075 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.488420 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.496571 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.497844 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.505437 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-b4gn8" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.551005 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdss9\" (UniqueName: \"kubernetes.io/projected/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-kube-api-access-rdss9\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.551055 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.551124 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.551170 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq788\" (UniqueName: \"kubernetes.io/projected/82f1699a-e706-412e-af47-89b0ed090f92-kube-api-access-lq788\") pod \"watcher-operator-controller-manager-5db88f68c-vvfcn\" (UID: \"82f1699a-e706-412e-af47-89b0ed090f92\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.568762 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq788\" (UniqueName: \"kubernetes.io/projected/82f1699a-e706-412e-af47-89b0ed090f92-kube-api-access-lq788\") pod \"watcher-operator-controller-manager-5db88f68c-vvfcn\" (UID: \"82f1699a-e706-412e-af47-89b0ed090f92\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.628327 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.634077 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.653993 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.654144 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w96q\" (UniqueName: \"kubernetes.io/projected/144ca353-5e11-4eab-a29e-71e41e63ea9f-kube-api-access-6w96q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4jh78\" (UID: \"144ca353-5e11-4eab-a29e-71e41e63ea9f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.654164 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.654182 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdss9\" (UniqueName: \"kubernetes.io/projected/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-kube-api-access-rdss9\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.654212 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: 
\"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.654243 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:18:55.154220033 +0000 UTC m=+791.097628165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.654283 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.654367 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.654438 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:18:55.154417169 +0000 UTC m=+791.097825301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "metrics-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.654708 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: E0215 17:18:54.654747 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert podName:4378a5c2-4e4a-422a-9cd5-b55433ac3fbe nodeName:}" failed. No retries permitted until 2026-02-15 17:18:55.654739418 +0000 UTC m=+791.598147550 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" (UID: "4378a5c2-4e4a-422a-9cd5-b55433ac3fbe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.699777 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdss9\" (UniqueName: \"kubernetes.io/projected/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-kube-api-access-rdss9\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.701298 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.717544 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.759231 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w96q\" (UniqueName: \"kubernetes.io/projected/144ca353-5e11-4eab-a29e-71e41e63ea9f-kube-api-access-6w96q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4jh78\" (UID: \"144ca353-5e11-4eab-a29e-71e41e63ea9f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.778811 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr"] Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.784511 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w96q\" (UniqueName: \"kubernetes.io/projected/144ca353-5e11-4eab-a29e-71e41e63ea9f-kube-api-access-6w96q\") pod \"rabbitmq-cluster-operator-manager-668c99d594-4jh78\" (UID: \"144ca353-5e11-4eab-a29e-71e41e63ea9f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.839818 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" Feb 15 17:18:54 crc kubenswrapper[4585]: I0215 17:18:54.933667 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.065420 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.122487 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.140186 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.163559 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr" event={"ID":"72febc8a-8640-43c3-a6d2-5ca8156d827a","Type":"ContainerStarted","Data":"5aff2622a51e116a87386a053207df122df5b1ae13042f7ac593df584d153677"} Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.168990 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" event={"ID":"afc8afe1-32b1-47ef-9fd4-6331fec926f5","Type":"ContainerStarted","Data":"d6c93be8eab072ee631f76d284658abb8897374fecb38792575bdb1550575e0a"} Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.169828 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7" event={"ID":"a0a88360-7506-4420-a652-8abb63a4f2ea","Type":"ContainerStarted","Data":"ef5ba855fe718216ddca36df78d44db6ca5d72fdfe323919a9e2fc4b10b702c2"} Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.171187 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.171315 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.171402 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.171424 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.171449 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert podName:9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414 nodeName:}" failed. No retries permitted until 2026-02-15 17:18:57.17143185 +0000 UTC m=+793.114839982 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert") pod "infra-operator-controller-manager-7c4bfc5b96-482zr" (UID: "9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414") : secret "infra-operator-webhook-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.171566 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.171636 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:18:56.171622456 +0000 UTC m=+792.115030588 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "webhook-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.173177 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp" event={"ID":"345af412-80a4-4d2b-9738-9ed005847c6a","Type":"ContainerStarted","Data":"51ac1d7564482e5b24f901be1c9fb537cc918e4adfe2fc263cdfe19f0ed3ef7a"} Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.176108 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd" event={"ID":"c1b6598f-2367-4118-9e8c-90018190d1fb","Type":"ContainerStarted","Data":"862f54ada794ca7e48c4264ff67ffb5835ed1bc2ae3b3b4a0196f238371a2bd7"} Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.177764 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret 
"metrics-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.177851 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:18:56.177833749 +0000 UTC m=+792.121241881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "metrics-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.218924 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.443001 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.446916 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.546191 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv"] Feb 15 17:18:55 crc kubenswrapper[4585]: W0215 17:18:55.555729 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7796cdbe_ab03_4bac_b2cc_e828e42f438f.slice/crio-f657dc10038eafbc96adad511c918f47778a239a9131a20957a456f4a241297a WatchSource:0}: Error finding container f657dc10038eafbc96adad511c918f47778a239a9131a20957a456f4a241297a: Status 404 returned error can't find the container with id 
f657dc10038eafbc96adad511c918f47778a239a9131a20957a456f4a241297a Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.701484 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.701866 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.702093 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert podName:4378a5c2-4e4a-422a-9cd5-b55433ac3fbe nodeName:}" failed. No retries permitted until 2026-02-15 17:18:57.702077161 +0000 UTC m=+793.645485293 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" (UID: "4378a5c2-4e4a-422a-9cd5-b55433ac3fbe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.722094 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-5dh24"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.739528 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.739576 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.747996 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc"] Feb 15 17:18:55 crc kubenswrapper[4585]: W0215 17:18:55.794311 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf81164_cb02_4381_a1e5_b28b7648f613.slice/crio-7793fbae93b9466b7e84f0df69574b8083eddc9f0f991f7f8bf75e02f1858fb9 WatchSource:0}: Error finding container 7793fbae93b9466b7e84f0df69574b8083eddc9f0f991f7f8bf75e02f1858fb9: Status 404 returned error can't find the container with id 7793fbae93b9466b7e84f0df69574b8083eddc9f0f991f7f8bf75e02f1858fb9 Feb 15 17:18:55 crc kubenswrapper[4585]: W0215 17:18:55.809722 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d704152_30bf_4588_ba21_bc5f23265fb6.slice/crio-dfc0369c8376707bb9b11fda54f5f30131b4723bf4b131b2216b73b1cd3e48f9 WatchSource:0}: Error finding container 
dfc0369c8376707bb9b11fda54f5f30131b4723bf4b131b2216b73b1cd3e48f9: Status 404 returned error can't find the container with id dfc0369c8376707bb9b11fda54f5f30131b4723bf4b131b2216b73b1cd3e48f9 Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.814287 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bqzkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-5cqqm_openstack-operators(3d704152-30bf-4588-ba21-bc5f23265fb6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.815500 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" podUID="3d704152-30bf-4588-ba21-bc5f23265fb6" Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.854285 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk"] Feb 15 17:18:55 crc kubenswrapper[4585]: W0215 17:18:55.857241 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a5c9ab0_1600_4130_8293_6672efc2188d.slice/crio-0eeebaa193639a05305b61fc43ce794cbdb5091f1f7db3134075d8dc5b680567 WatchSource:0}: Error finding container 0eeebaa193639a05305b61fc43ce794cbdb5091f1f7db3134075d8dc5b680567: Status 404 returned error can't find the container with id 
0eeebaa193639a05305b61fc43ce794cbdb5091f1f7db3134075d8dc5b680567 Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.859389 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qt9cx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-rh8kk_openstack-operators(8a5c9ab0-1600-4130-8293-6672efc2188d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.861660 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" podUID="8a5c9ab0-1600-4130-8293-6672efc2188d" Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.917955 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.927711 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-2l54v"] Feb 15 17:18:55 crc kubenswrapper[4585]: I0215 17:18:55.942995 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78"] Feb 15 17:18:55 crc kubenswrapper[4585]: W0215 17:18:55.971566 4585 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f1699a_e706_412e_af47_89b0ed090f92.slice/crio-cb028ef83e555ce3d4102364fc6d2b3ecc8376d37de3314656778ffc890c606d WatchSource:0}: Error finding container cb028ef83e555ce3d4102364fc6d2b3ecc8376d37de3314656778ffc890c606d: Status 404 returned error can't find the container with id cb028ef83e555ce3d4102364fc6d2b3ecc8376d37de3314656778ffc890c606d Feb 15 17:18:55 crc kubenswrapper[4585]: W0215 17:18:55.971838 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod144ca353_5e11_4eab_a29e_71e41e63ea9f.slice/crio-65bbe4c2cc259fd2b69c105eb0868f63e329a502841c80efe20112070783e3f2 WatchSource:0}: Error finding container 65bbe4c2cc259fd2b69c105eb0868f63e329a502841c80efe20112070783e3f2: Status 404 returned error can't find the container with id 65bbe4c2cc259fd2b69c105eb0868f63e329a502841c80efe20112070783e3f2 Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.978831 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lq788,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-vvfcn_openstack-operators(82f1699a-e706-412e-af47-89b0ed090f92): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.979973 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" podUID="82f1699a-e706-412e-af47-89b0ed090f92" Feb 15 17:18:55 crc 
kubenswrapper[4585]: W0215 17:18:55.981766 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9f7d61_65f7_44d8_8fc6_2e0c8c32cdf8.slice/crio-adcf4f5cc7c52aec7b58f561cb00fd7396ca144d35dae8631ac64ea5128e24ab WatchSource:0}: Error finding container adcf4f5cc7c52aec7b58f561cb00fd7396ca144d35dae8631ac64ea5128e24ab: Status 404 returned error can't find the container with id adcf4f5cc7c52aec7b58f561cb00fd7396ca144d35dae8631ac64ea5128e24ab Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.990921 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8k4pl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-2l54v_openstack-operators(cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 15 17:18:55 crc kubenswrapper[4585]: E0215 17:18:55.992211 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" podUID="cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8" Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.198530 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" event={"ID":"82f1699a-e706-412e-af47-89b0ed090f92","Type":"ContainerStarted","Data":"cb028ef83e555ce3d4102364fc6d2b3ecc8376d37de3314656778ffc890c606d"} Feb 15 17:18:56 crc 
kubenswrapper[4585]: I0215 17:18:56.201975 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" event={"ID":"58ebd0ec-b27f-493b-accb-7c43c2408f19","Type":"ContainerStarted","Data":"1dcb1c39df89a5c3d66544802bc8d3de2d0007cf0d0eb5b756f5c6fa16f99738"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.208202 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.208282 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.208470 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.208520 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:18:58.208506641 +0000 UTC m=+794.151914773 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "webhook-server-cert" not found Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.208836 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.208865 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:18:58.20885862 +0000 UTC m=+794.152266752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "metrics-server-cert" not found Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.208965 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" podUID="82f1699a-e706-412e-af47-89b0ed090f92" Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.218080 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" event={"ID":"7796cdbe-ab03-4bac-b2cc-e828e42f438f","Type":"ContainerStarted","Data":"f657dc10038eafbc96adad511c918f47778a239a9131a20957a456f4a241297a"} Feb 15 17:18:56 crc 
kubenswrapper[4585]: I0215 17:18:56.224983 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" event={"ID":"abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b","Type":"ContainerStarted","Data":"d1e7e7cde73cc1f26a68530d5e10af8331073e24049bda6e2c0b39697419a2fd"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.228652 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" event={"ID":"58be13ef-eba2-4fdd-9fff-2b96d1b38143","Type":"ContainerStarted","Data":"95b22006e5eb6bf04e2df06ee224729cde2b77a5302bdc9395a5e30449fa6297"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.234463 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" event={"ID":"3d704152-30bf-4588-ba21-bc5f23265fb6","Type":"ContainerStarted","Data":"dfc0369c8376707bb9b11fda54f5f30131b4723bf4b131b2216b73b1cd3e48f9"} Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.241272 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" podUID="3d704152-30bf-4588-ba21-bc5f23265fb6" Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.243335 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x" event={"ID":"91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2","Type":"ContainerStarted","Data":"40fc5997ad777920b49c4689034e76c44afacb7efb063f4c7b86e6433c2fc097"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.265559 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" event={"ID":"1bf81164-cb02-4381-a1e5-b28b7648f613","Type":"ContainerStarted","Data":"7793fbae93b9466b7e84f0df69574b8083eddc9f0f991f7f8bf75e02f1858fb9"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.271997 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" event={"ID":"144ca353-5e11-4eab-a29e-71e41e63ea9f","Type":"ContainerStarted","Data":"65bbe4c2cc259fd2b69c105eb0868f63e329a502841c80efe20112070783e3f2"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.277400 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" event={"ID":"e447b2b4-5bfe-4481-a36a-241124fd507a","Type":"ContainerStarted","Data":"1e331cb594935d6d2dbcca88e7c7e64a522d63a7744cd9dedc2e51038d2dfe90"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.279636 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" event={"ID":"d32edc1b-dcb5-4338-801c-fb1657a78892","Type":"ContainerStarted","Data":"c7c5f73f9979e94356a6d975229ee9f19d30c6dddfc71c179c177c65c467c323"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.288853 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" event={"ID":"cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8","Type":"ContainerStarted","Data":"adcf4f5cc7c52aec7b58f561cb00fd7396ca144d35dae8631ac64ea5128e24ab"} Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.290405 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" podUID="cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8" Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.290862 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" event={"ID":"cb189c3a-a98e-4fc4-b074-ce2e17b2950b","Type":"ContainerStarted","Data":"dacb53d2eba6a40ca5fea39c5bf6b7a8f6e671907093736def7bed1f66a489ae"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.294264 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" event={"ID":"64500064-0cb9-4c1e-9370-960a7aa9617c","Type":"ContainerStarted","Data":"6657cc52363a994d2c1eb34de3ec474aeaf78d1c7f5fd08b359af03ff3c7f6fe"} Feb 15 17:18:56 crc kubenswrapper[4585]: I0215 17:18:56.296069 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" event={"ID":"8a5c9ab0-1600-4130-8293-6672efc2188d","Type":"ContainerStarted","Data":"0eeebaa193639a05305b61fc43ce794cbdb5091f1f7db3134075d8dc5b680567"} Feb 15 17:18:56 crc kubenswrapper[4585]: E0215 17:18:56.302144 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" podUID="8a5c9ab0-1600-4130-8293-6672efc2188d" Feb 15 17:18:57 crc kubenswrapper[4585]: I0215 17:18:57.220204 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " 
pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.220348 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.220526 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert podName:9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414 nodeName:}" failed. No retries permitted until 2026-02-15 17:19:01.220513375 +0000 UTC m=+797.163921507 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert") pod "infra-operator-controller-manager-7c4bfc5b96-482zr" (UID: "9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414") : secret "infra-operator-webhook-server-cert" not found Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.310283 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" podUID="cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8" Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.310302 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" podUID="82f1699a-e706-412e-af47-89b0ed090f92" Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.310330 4585 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" podUID="3d704152-30bf-4588-ba21-bc5f23265fb6" Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.310395 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" podUID="8a5c9ab0-1600-4130-8293-6672efc2188d" Feb 15 17:18:57 crc kubenswrapper[4585]: I0215 17:18:57.730492 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.730818 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:57 crc kubenswrapper[4585]: E0215 17:18:57.730872 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert podName:4378a5c2-4e4a-422a-9cd5-b55433ac3fbe nodeName:}" failed. No retries permitted until 2026-02-15 17:19:01.730856613 +0000 UTC m=+797.674264745 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" (UID: "4378a5c2-4e4a-422a-9cd5-b55433ac3fbe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:18:58 crc kubenswrapper[4585]: I0215 17:18:58.239508 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:58 crc kubenswrapper[4585]: I0215 17:18:58.239656 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:18:58 crc kubenswrapper[4585]: E0215 17:18:58.239892 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 15 17:18:58 crc kubenswrapper[4585]: E0215 17:18:58.239983 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:19:02.239961086 +0000 UTC m=+798.183369218 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "webhook-server-cert" not found Feb 15 17:18:58 crc kubenswrapper[4585]: E0215 17:18:58.240470 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 15 17:18:58 crc kubenswrapper[4585]: E0215 17:18:58.240511 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:19:02.240502681 +0000 UTC m=+798.183910803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "metrics-server-cert" not found Feb 15 17:19:01 crc kubenswrapper[4585]: I0215 17:19:01.302399 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:19:01 crc kubenswrapper[4585]: E0215 17:19:01.302794 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 15 17:19:01 crc kubenswrapper[4585]: E0215 17:19:01.302883 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert 
podName:9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414 nodeName:}" failed. No retries permitted until 2026-02-15 17:19:09.302861486 +0000 UTC m=+805.246269618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert") pod "infra-operator-controller-manager-7c4bfc5b96-482zr" (UID: "9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414") : secret "infra-operator-webhook-server-cert" not found Feb 15 17:19:01 crc kubenswrapper[4585]: I0215 17:19:01.809127 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:19:01 crc kubenswrapper[4585]: E0215 17:19:01.809280 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:19:01 crc kubenswrapper[4585]: E0215 17:19:01.809558 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert podName:4378a5c2-4e4a-422a-9cd5-b55433ac3fbe nodeName:}" failed. No retries permitted until 2026-02-15 17:19:09.809540932 +0000 UTC m=+805.752949064 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert") pod "openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" (UID: "4378a5c2-4e4a-422a-9cd5-b55433ac3fbe") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 15 17:19:02 crc kubenswrapper[4585]: I0215 17:19:02.317365 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:02 crc kubenswrapper[4585]: I0215 17:19:02.317468 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:02 crc kubenswrapper[4585]: E0215 17:19:02.317558 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 15 17:19:02 crc kubenswrapper[4585]: E0215 17:19:02.317638 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 15 17:19:02 crc kubenswrapper[4585]: E0215 17:19:02.317645 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:19:10.317627637 +0000 UTC m=+806.261035769 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "metrics-server-cert" not found Feb 15 17:19:02 crc kubenswrapper[4585]: E0215 17:19:02.317710 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs podName:4a8513b3-0e8a-44d3-9fe6-781cac50db0a nodeName:}" failed. No retries permitted until 2026-02-15 17:19:10.317690609 +0000 UTC m=+806.261098731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs") pod "openstack-operator-controller-manager-66bb5545bf-x7sb4" (UID: "4a8513b3-0e8a-44d3-9fe6-781cac50db0a") : secret "webhook-server-cert" not found Feb 15 17:19:09 crc kubenswrapper[4585]: I0215 17:19:09.347841 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:19:09 crc kubenswrapper[4585]: I0215 17:19:09.359175 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414-cert\") pod \"infra-operator-controller-manager-7c4bfc5b96-482zr\" (UID: \"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414\") " pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:19:09 crc kubenswrapper[4585]: I0215 17:19:09.469190 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:19:09 crc kubenswrapper[4585]: I0215 17:19:09.856387 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:19:09 crc kubenswrapper[4585]: I0215 17:19:09.870144 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4378a5c2-4e4a-422a-9cd5-b55433ac3fbe-cert\") pod \"openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt\" (UID: \"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:19:09 crc kubenswrapper[4585]: I0215 17:19:09.875641 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:19:10 crc kubenswrapper[4585]: I0215 17:19:10.365412 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:10 crc kubenswrapper[4585]: I0215 17:19:10.365511 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:10 crc kubenswrapper[4585]: I0215 17:19:10.370333 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-metrics-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:10 crc kubenswrapper[4585]: I0215 17:19:10.371004 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a8513b3-0e8a-44d3-9fe6-781cac50db0a-webhook-certs\") pod \"openstack-operator-controller-manager-66bb5545bf-x7sb4\" (UID: \"4a8513b3-0e8a-44d3-9fe6-781cac50db0a\") " pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:10 crc kubenswrapper[4585]: I0215 17:19:10.379245 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:11 crc kubenswrapper[4585]: E0215 17:19:11.167890 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 15 17:19:11 crc kubenswrapper[4585]: E0215 17:19:11.168338 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krmx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-p8qvp_openstack-operators(64500064-0cb9-4c1e-9370-960a7aa9617c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:19:11 crc kubenswrapper[4585]: E0215 17:19:11.170318 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" podUID="64500064-0cb9-4c1e-9370-960a7aa9617c" Feb 15 17:19:11 crc kubenswrapper[4585]: E0215 17:19:11.441191 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" podUID="64500064-0cb9-4c1e-9370-960a7aa9617c" Feb 15 17:19:14 crc kubenswrapper[4585]: E0215 17:19:14.614930 4585 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 15 17:19:14 crc kubenswrapper[4585]: E0215 17:19:14.615909 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-khk84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-p75lp_openstack-operators(345af412-80a4-4d2b-9738-9ed005847c6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:19:14 crc kubenswrapper[4585]: E0215 17:19:14.617314 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp" podUID="345af412-80a4-4d2b-9738-9ed005847c6a" Feb 15 17:19:15 crc kubenswrapper[4585]: E0215 17:19:15.423704 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04" Feb 15 17:19:15 crc kubenswrapper[4585]: E0215 17:19:15.423845 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xwqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-5dh24_openstack-operators(58ebd0ec-b27f-493b-accb-7c43c2408f19): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:19:15 crc kubenswrapper[4585]: E0215 17:19:15.424967 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" podUID="58ebd0ec-b27f-493b-accb-7c43c2408f19" Feb 15 17:19:15 crc kubenswrapper[4585]: E0215 17:19:15.475824 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp" podUID="345af412-80a4-4d2b-9738-9ed005847c6a" Feb 15 17:19:15 crc kubenswrapper[4585]: E0215 17:19:15.476674 4585 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" podUID="58ebd0ec-b27f-493b-accb-7c43c2408f19" Feb 15 17:19:16 crc kubenswrapper[4585]: E0215 17:19:16.238542 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 15 17:19:16 crc kubenswrapper[4585]: E0215 17:19:16.238955 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mfqp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-ndnrr_openstack-operators(abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:19:16 crc kubenswrapper[4585]: E0215 17:19:16.240141 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" podUID="abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b" Feb 15 17:19:16 crc kubenswrapper[4585]: E0215 17:19:16.486263 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" podUID="abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b" Feb 15 17:19:18 crc kubenswrapper[4585]: E0215 17:19:18.604353 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 15 17:19:18 crc kubenswrapper[4585]: E0215 17:19:18.605095 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2jvrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-chqj4_openstack-operators(d32edc1b-dcb5-4338-801c-fb1657a78892): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:19:18 crc kubenswrapper[4585]: E0215 17:19:18.606854 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" podUID="d32edc1b-dcb5-4338-801c-fb1657a78892" Feb 15 17:19:19 crc kubenswrapper[4585]: E0215 17:19:19.182792 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 15 17:19:19 crc kubenswrapper[4585]: E0215 17:19:19.183030 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bzwjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-btbfc_openstack-operators(cb189c3a-a98e-4fc4-b074-ce2e17b2950b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:19:19 crc kubenswrapper[4585]: E0215 17:19:19.184130 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" podUID="cb189c3a-a98e-4fc4-b074-ce2e17b2950b" Feb 15 17:19:19 crc kubenswrapper[4585]: E0215 17:19:19.508842 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" podUID="d32edc1b-dcb5-4338-801c-fb1657a78892" Feb 15 17:19:19 crc kubenswrapper[4585]: E0215 17:19:19.509172 4585 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" podUID="cb189c3a-a98e-4fc4-b074-ce2e17b2950b" Feb 15 17:19:21 crc kubenswrapper[4585]: E0215 17:19:21.051027 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 15 17:19:21 crc kubenswrapper[4585]: E0215 17:19:21.051307 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bghrb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-ztxlq_openstack-operators(afc8afe1-32b1-47ef-9fd4-6331fec926f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:19:21 crc kubenswrapper[4585]: E0215 17:19:21.052438 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" podUID="afc8afe1-32b1-47ef-9fd4-6331fec926f5" Feb 15 17:19:21 crc kubenswrapper[4585]: E0215 17:19:21.528323 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" podUID="afc8afe1-32b1-47ef-9fd4-6331fec926f5" Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.565430 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt"] Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.587551 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr" event={"ID":"72febc8a-8640-43c3-a6d2-5ca8156d827a","Type":"ContainerStarted","Data":"3992f3c3c5aafc30b83862ba52643eabb2e99d48d38ef10d102346d9d96889e0"} Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.590574 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr" Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.640939 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" event={"ID":"e447b2b4-5bfe-4481-a36a-241124fd507a","Type":"ContainerStarted","Data":"e051deea31991b9147e3c078391f4c17fea09ad7961cd3274950666114c92ff3"} Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.641004 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.652230 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7" event={"ID":"a0a88360-7506-4420-a652-8abb63a4f2ea","Type":"ContainerStarted","Data":"1253dd327daf40883ca8dd59b4d82c11ad71179534d400455f628ecd81d5120e"} Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.652513 4585 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7" Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.668703 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr" podStartSLOduration=4.691666017 podStartE2EDuration="29.668686147s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:54.909216257 +0000 UTC m=+790.852624389" lastFinishedPulling="2026-02-15 17:19:19.886236387 +0000 UTC m=+815.829644519" observedRunningTime="2026-02-15 17:19:22.614754463 +0000 UTC m=+818.558162595" watchObservedRunningTime="2026-02-15 17:19:22.668686147 +0000 UTC m=+818.612094279" Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.671355 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" podStartSLOduration=5.549962314 podStartE2EDuration="29.671348151s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.763638416 +0000 UTC m=+791.707046548" lastFinishedPulling="2026-02-15 17:19:19.885024253 +0000 UTC m=+815.828432385" observedRunningTime="2026-02-15 17:19:22.657389334 +0000 UTC m=+818.600797466" watchObservedRunningTime="2026-02-15 17:19:22.671348151 +0000 UTC m=+818.614756283" Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.711705 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr"] Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.713971 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7" podStartSLOduration=5.623067827 podStartE2EDuration="30.713954331s" podCreationTimestamp="2026-02-15 17:18:52 +0000 UTC" 
firstStartedPulling="2026-02-15 17:18:54.793306046 +0000 UTC m=+790.736714168" lastFinishedPulling="2026-02-15 17:19:19.88419254 +0000 UTC m=+815.827600672" observedRunningTime="2026-02-15 17:19:22.678117168 +0000 UTC m=+818.621525300" watchObservedRunningTime="2026-02-15 17:19:22.713954331 +0000 UTC m=+818.657362463" Feb 15 17:19:22 crc kubenswrapper[4585]: W0215 17:19:22.782807 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee8eb66_b9ff_4ab0_8fb9_8a85a4511414.slice/crio-cc8edefb434cd7779a4aaf65d0e162633e4f7c63c32c70bbab129d66c963f450 WatchSource:0}: Error finding container cc8edefb434cd7779a4aaf65d0e162633e4f7c63c32c70bbab129d66c963f450: Status 404 returned error can't find the container with id cc8edefb434cd7779a4aaf65d0e162633e4f7c63c32c70bbab129d66c963f450 Feb 15 17:19:22 crc kubenswrapper[4585]: I0215 17:19:22.877578 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4"] Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.658915 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" event={"ID":"4a8513b3-0e8a-44d3-9fe6-781cac50db0a","Type":"ContainerStarted","Data":"912e5a87264ec6439e671791fbf6f008166038d7298daa9c53da5d4e8366b89c"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.660200 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x" event={"ID":"91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2","Type":"ContainerStarted","Data":"55320c47e38172f6e9d0201ffd6e994d7351f9e81193724683af70e5a9cf9ea3"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.660473 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x" Feb 15 17:19:23 crc 
kubenswrapper[4585]: I0215 17:19:23.661395 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" event={"ID":"1bf81164-cb02-4381-a1e5-b28b7648f613","Type":"ContainerStarted","Data":"4a22c82a811b1266ab976bdd2c7446650795b2abf3450b294640b755f47a7e01"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.661520 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.662669 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" event={"ID":"144ca353-5e11-4eab-a29e-71e41e63ea9f","Type":"ContainerStarted","Data":"d11d1a4a6ddc35099d2aea4a32591a46535e9457178e2c68d89ece896c52e70f"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.663746 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd" event={"ID":"c1b6598f-2367-4118-9e8c-90018190d1fb","Type":"ContainerStarted","Data":"87a90f66f3527081d5ac209c00b2fec911a34a900b5ae864cd3e918f80768deb"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.663942 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd" Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.664579 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" event={"ID":"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe","Type":"ContainerStarted","Data":"8a37839e0112161d5d64c23a59f4ed1b98861ddf071ae85f32cce2274906c0d8"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.665527 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" event={"ID":"58be13ef-eba2-4fdd-9fff-2b96d1b38143","Type":"ContainerStarted","Data":"4f096a80646e5b791e1500c4455dfc2e1f83110a2100c59d30ac68fe9c33d231"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.666865 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" event={"ID":"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414","Type":"ContainerStarted","Data":"cc8edefb434cd7779a4aaf65d0e162633e4f7c63c32c70bbab129d66c963f450"} Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.676743 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x" podStartSLOduration=6.044307178 podStartE2EDuration="30.676729932s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.255351975 +0000 UTC m=+791.198760107" lastFinishedPulling="2026-02-15 17:19:19.887774649 +0000 UTC m=+815.831182861" observedRunningTime="2026-02-15 17:19:23.675615221 +0000 UTC m=+819.619023363" watchObservedRunningTime="2026-02-15 17:19:23.676729932 +0000 UTC m=+819.620138064" Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.704232 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd" podStartSLOduration=5.5534269609999996 podStartE2EDuration="30.704210874s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:54.733772267 +0000 UTC m=+790.677180399" lastFinishedPulling="2026-02-15 17:19:19.88455618 +0000 UTC m=+815.827964312" observedRunningTime="2026-02-15 17:19:23.691067989 +0000 UTC m=+819.634476121" watchObservedRunningTime="2026-02-15 17:19:23.704210874 +0000 UTC m=+819.647618996" Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.719591 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" podStartSLOduration=6.632766239 podStartE2EDuration="30.719573919s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.799069047 +0000 UTC m=+791.742477179" lastFinishedPulling="2026-02-15 17:19:19.885876717 +0000 UTC m=+815.829284859" observedRunningTime="2026-02-15 17:19:23.710779295 +0000 UTC m=+819.654187427" watchObservedRunningTime="2026-02-15 17:19:23.719573919 +0000 UTC m=+819.662982051" Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.749267 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-4jh78" podStartSLOduration=5.841427001 podStartE2EDuration="29.749251211s" podCreationTimestamp="2026-02-15 17:18:54 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.976228126 +0000 UTC m=+791.919636258" lastFinishedPulling="2026-02-15 17:19:19.884052326 +0000 UTC m=+815.827460468" observedRunningTime="2026-02-15 17:19:23.745671801 +0000 UTC m=+819.689079933" watchObservedRunningTime="2026-02-15 17:19:23.749251211 +0000 UTC m=+819.692659343" Feb 15 17:19:23 crc kubenswrapper[4585]: I0215 17:19:23.777360 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" podStartSLOduration=6.190151339 podStartE2EDuration="30.777343359s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.298050209 +0000 UTC m=+791.241458341" lastFinishedPulling="2026-02-15 17:19:19.885242219 +0000 UTC m=+815.828650361" observedRunningTime="2026-02-15 17:19:23.775867368 +0000 UTC m=+819.719275500" watchObservedRunningTime="2026-02-15 17:19:23.777343359 +0000 UTC m=+819.720751491" Feb 15 17:19:24 crc kubenswrapper[4585]: I0215 17:19:24.033002 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.684848 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" event={"ID":"7796cdbe-ab03-4bac-b2cc-e828e42f438f","Type":"ContainerStarted","Data":"5771e3241de5aa961d8b55d175f73769a5df9d20dd5b664cda2eb28b9fcd19a2"} Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.686413 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.693712 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" event={"ID":"64500064-0cb9-4c1e-9370-960a7aa9617c","Type":"ContainerStarted","Data":"24c085b884db102c28cf1c7b2321595d291d1118a05ced50b967fd4e408d9f15"} Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.694208 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.699881 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" event={"ID":"cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8","Type":"ContainerStarted","Data":"451dce8b1e61dcd9f1b069cc9a2b600547b745861717468dbb170b58b82f97dd"} Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.700403 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.701374 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" 
event={"ID":"8a5c9ab0-1600-4130-8293-6672efc2188d","Type":"ContainerStarted","Data":"9030b60e58974e0721bf4646375af1a47e7a73665d4c94f5a8d842ecfb9a6a5f"} Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.701874 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.704906 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" event={"ID":"3d704152-30bf-4588-ba21-bc5f23265fb6","Type":"ContainerStarted","Data":"8e0832d601cbb599ec9f24f8010631a098732fec0a1dd5c3728da2e9c1a0b6a1"} Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.705360 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.711662 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" event={"ID":"4a8513b3-0e8a-44d3-9fe6-781cac50db0a","Type":"ContainerStarted","Data":"603d807261254304b324e9510bb5bf8087664c0e776e1b93b76c754908a1bc2c"} Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.712333 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.713961 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" event={"ID":"82f1699a-e706-412e-af47-89b0ed090f92","Type":"ContainerStarted","Data":"80c8d414248e5a00afedd83157aceb275275dd6b656f792adc05d9fac3fbf4b6"} Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.714415 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.721309 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" podStartSLOduration=8.393790174 podStartE2EDuration="32.721297371s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.559309086 +0000 UTC m=+791.502717218" lastFinishedPulling="2026-02-15 17:19:19.886816283 +0000 UTC m=+815.830224415" observedRunningTime="2026-02-15 17:19:25.715695256 +0000 UTC m=+821.659103388" watchObservedRunningTime="2026-02-15 17:19:25.721297371 +0000 UTC m=+821.664705503" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.744342 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" podStartSLOduration=5.6142165429999995 podStartE2EDuration="32.744326499s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.50385801 +0000 UTC m=+791.447266142" lastFinishedPulling="2026-02-15 17:19:22.633967966 +0000 UTC m=+818.577376098" observedRunningTime="2026-02-15 17:19:25.743122675 +0000 UTC m=+821.686530807" watchObservedRunningTime="2026-02-15 17:19:25.744326499 +0000 UTC m=+821.687734631" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.847363 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" podStartSLOduration=6.443083106 podStartE2EDuration="32.847342263s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.859288956 +0000 UTC m=+791.802697088" lastFinishedPulling="2026-02-15 17:19:22.263548113 +0000 UTC m=+818.206956245" observedRunningTime="2026-02-15 17:19:25.828009087 +0000 UTC m=+821.771417219" 
watchObservedRunningTime="2026-02-15 17:19:25.847342263 +0000 UTC m=+821.790750395" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.881643 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" podStartSLOduration=6.680871943 podStartE2EDuration="32.881619272s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.978722615 +0000 UTC m=+791.922130747" lastFinishedPulling="2026-02-15 17:19:22.179469924 +0000 UTC m=+818.122878076" observedRunningTime="2026-02-15 17:19:25.845958674 +0000 UTC m=+821.789366816" watchObservedRunningTime="2026-02-15 17:19:25.881619272 +0000 UTC m=+821.825027414" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.882379 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" podStartSLOduration=6.538477198 podStartE2EDuration="32.882372693s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.814143816 +0000 UTC m=+791.757551938" lastFinishedPulling="2026-02-15 17:19:22.158039291 +0000 UTC m=+818.101447433" observedRunningTime="2026-02-15 17:19:25.864043266 +0000 UTC m=+821.807451398" watchObservedRunningTime="2026-02-15 17:19:25.882372693 +0000 UTC m=+821.825780825" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.922034 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" podStartSLOduration=6.732140702 podStartE2EDuration="32.922017401s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.9908296 +0000 UTC m=+791.934237732" lastFinishedPulling="2026-02-15 17:19:22.180706289 +0000 UTC m=+818.124114431" observedRunningTime="2026-02-15 17:19:25.886412185 +0000 UTC m=+821.829820317" 
watchObservedRunningTime="2026-02-15 17:19:25.922017401 +0000 UTC m=+821.865425533" Feb 15 17:19:25 crc kubenswrapper[4585]: I0215 17:19:25.949043 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" podStartSLOduration=32.94902825 podStartE2EDuration="32.94902825s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:19:25.942991743 +0000 UTC m=+821.886399875" watchObservedRunningTime="2026-02-15 17:19:25.94902825 +0000 UTC m=+821.892436382" Feb 15 17:19:26 crc kubenswrapper[4585]: I0215 17:19:26.726445 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp" event={"ID":"345af412-80a4-4d2b-9738-9ed005847c6a","Type":"ContainerStarted","Data":"b0213a35cd544c7b48f79d5b3144332c2bd596ff7cf839be25d6bcee386675f3"} Feb 15 17:19:26 crc kubenswrapper[4585]: I0215 17:19:26.728346 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp" Feb 15 17:19:26 crc kubenswrapper[4585]: I0215 17:19:26.747400 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp" podStartSLOduration=3.137543452 podStartE2EDuration="34.747385056s" podCreationTimestamp="2026-02-15 17:18:52 +0000 UTC" firstStartedPulling="2026-02-15 17:18:54.80718925 +0000 UTC m=+790.750597382" lastFinishedPulling="2026-02-15 17:19:26.417030854 +0000 UTC m=+822.360438986" observedRunningTime="2026-02-15 17:19:26.744695711 +0000 UTC m=+822.688103843" watchObservedRunningTime="2026-02-15 17:19:26.747385056 +0000 UTC m=+822.690793188" Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.758025 4585 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" event={"ID":"58ebd0ec-b27f-493b-accb-7c43c2408f19","Type":"ContainerStarted","Data":"95564abe2ba9f3feef2f01e0f36d5d1b49fe0f3af54fee7348b961b2675cb793"} Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.758636 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.765117 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" event={"ID":"4378a5c2-4e4a-422a-9cd5-b55433ac3fbe","Type":"ContainerStarted","Data":"ed7831ae469eb13d8ce72de7c918b896fa956bbc8f6739f8b1224e6cae5d2bd8"} Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.765502 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.767135 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" event={"ID":"abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b","Type":"ContainerStarted","Data":"e586ef4946329c32cec063d8eb38e36fdb24c4d4e757cdcfe1739dd08ad48c25"} Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.767279 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.768630 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" event={"ID":"9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414","Type":"ContainerStarted","Data":"d4c307cc06d4059ee4ea412c3833184c3d9a10334908802db4b8756cf2977778"} Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 
17:19:29.768764 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.800348 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" podStartSLOduration=3.244240731 podStartE2EDuration="36.80032969s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.745102913 +0000 UTC m=+791.688511045" lastFinishedPulling="2026-02-15 17:19:29.301191832 +0000 UTC m=+825.244600004" observedRunningTime="2026-02-15 17:19:29.780694846 +0000 UTC m=+825.724102978" watchObservedRunningTime="2026-02-15 17:19:29.80032969 +0000 UTC m=+825.743737822" Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.800912 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" podStartSLOduration=30.285867245 podStartE2EDuration="36.800907646s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:19:22.784965758 +0000 UTC m=+818.728373890" lastFinishedPulling="2026-02-15 17:19:29.300006119 +0000 UTC m=+825.243414291" observedRunningTime="2026-02-15 17:19:29.798064987 +0000 UTC m=+825.741473119" watchObservedRunningTime="2026-02-15 17:19:29.800907646 +0000 UTC m=+825.744315778" Feb 15 17:19:29 crc kubenswrapper[4585]: I0215 17:19:29.824961 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" podStartSLOduration=30.173660167 podStartE2EDuration="36.824946162s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:19:22.648540249 +0000 UTC m=+818.591948381" lastFinishedPulling="2026-02-15 17:19:29.299826244 +0000 UTC m=+825.243234376" 
observedRunningTime="2026-02-15 17:19:29.821241288 +0000 UTC m=+825.764649420" watchObservedRunningTime="2026-02-15 17:19:29.824946162 +0000 UTC m=+825.768354294" Feb 15 17:19:30 crc kubenswrapper[4585]: I0215 17:19:30.385188 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-66bb5545bf-x7sb4" Feb 15 17:19:30 crc kubenswrapper[4585]: I0215 17:19:30.413357 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" podStartSLOduration=3.362825225 podStartE2EDuration="37.413337331s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.251971042 +0000 UTC m=+791.195379174" lastFinishedPulling="2026-02-15 17:19:29.302483138 +0000 UTC m=+825.245891280" observedRunningTime="2026-02-15 17:19:29.844796902 +0000 UTC m=+825.788205034" watchObservedRunningTime="2026-02-15 17:19:30.413337331 +0000 UTC m=+826.356745473" Feb 15 17:19:32 crc kubenswrapper[4585]: I0215 17:19:32.807958 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" event={"ID":"cb189c3a-a98e-4fc4-b074-ce2e17b2950b","Type":"ContainerStarted","Data":"a21837886fa3636715a5e6e31cbc8d4c048ca5c891bf2159b09b0ad19ac5763e"} Feb 15 17:19:32 crc kubenswrapper[4585]: I0215 17:19:32.808672 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" Feb 15 17:19:32 crc kubenswrapper[4585]: I0215 17:19:32.835977 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" podStartSLOduration=2.790231222 podStartE2EDuration="39.835953582s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.240309849 +0000 UTC 
m=+791.183717971" lastFinishedPulling="2026-02-15 17:19:32.286032199 +0000 UTC m=+828.229440331" observedRunningTime="2026-02-15 17:19:32.834371099 +0000 UTC m=+828.777779231" watchObservedRunningTime="2026-02-15 17:19:32.835953582 +0000 UTC m=+828.779361744" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.424927 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vs4dd" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.638161 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-rr6q7" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.661578 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-p75lp" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.726515 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-j9lfr" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.799877 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-zf88x" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.863111 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-p8qvp" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.913917 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-5cqqm" Feb 15 17:19:33 crc kubenswrapper[4585]: I0215 17:19:33.967713 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-f2wrc" Feb 15 
17:19:34 crc kubenswrapper[4585]: I0215 17:19:34.037060 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-fvsm5" Feb 15 17:19:34 crc kubenswrapper[4585]: I0215 17:19:34.338212 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-8tzvv" Feb 15 17:19:34 crc kubenswrapper[4585]: I0215 17:19:34.370465 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-7s6mk" Feb 15 17:19:34 crc kubenswrapper[4585]: I0215 17:19:34.438500 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-5dh24" Feb 15 17:19:34 crc kubenswrapper[4585]: I0215 17:19:34.454566 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-rh8kk" Feb 15 17:19:34 crc kubenswrapper[4585]: I0215 17:19:34.473307 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-2l54v" Feb 15 17:19:34 crc kubenswrapper[4585]: I0215 17:19:34.638049 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-vvfcn" Feb 15 17:19:35 crc kubenswrapper[4585]: I0215 17:19:35.834083 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" event={"ID":"d32edc1b-dcb5-4338-801c-fb1657a78892","Type":"ContainerStarted","Data":"5aa2ec935d02e1b7e208570eac4b3142ef710111efd8913253f81a36bd26ac02"} Feb 15 17:19:35 crc kubenswrapper[4585]: I0215 17:19:35.834572 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" Feb 15 17:19:35 crc kubenswrapper[4585]: I0215 17:19:35.867011 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" podStartSLOduration=3.362568357 podStartE2EDuration="42.866983019s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.512806867 +0000 UTC m=+791.456214999" lastFinishedPulling="2026-02-15 17:19:35.017221519 +0000 UTC m=+830.960629661" observedRunningTime="2026-02-15 17:19:35.857300171 +0000 UTC m=+831.800708313" watchObservedRunningTime="2026-02-15 17:19:35.866983019 +0000 UTC m=+831.810391191" Feb 15 17:19:38 crc kubenswrapper[4585]: I0215 17:19:38.888827 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" event={"ID":"afc8afe1-32b1-47ef-9fd4-6331fec926f5","Type":"ContainerStarted","Data":"1d72dde3c993dffbf259312cc46ec1c1a6463163c89e9d624648a5304bf58da6"} Feb 15 17:19:38 crc kubenswrapper[4585]: I0215 17:19:38.890733 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" Feb 15 17:19:38 crc kubenswrapper[4585]: I0215 17:19:38.904472 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" podStartSLOduration=3.377006589 podStartE2EDuration="45.904457696s" podCreationTimestamp="2026-02-15 17:18:53 +0000 UTC" firstStartedPulling="2026-02-15 17:18:55.10028009 +0000 UTC m=+791.043688222" lastFinishedPulling="2026-02-15 17:19:37.627731187 +0000 UTC m=+833.571139329" observedRunningTime="2026-02-15 17:19:38.904453056 +0000 UTC m=+834.847861188" watchObservedRunningTime="2026-02-15 17:19:38.904457696 +0000 UTC m=+834.847865818" Feb 15 17:19:39 crc kubenswrapper[4585]: I0215 
17:19:39.479800 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7c4bfc5b96-482zr" Feb 15 17:19:39 crc kubenswrapper[4585]: I0215 17:19:39.887035 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt" Feb 15 17:19:43 crc kubenswrapper[4585]: I0215 17:19:43.913347 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-btbfc" Feb 15 17:19:43 crc kubenswrapper[4585]: I0215 17:19:43.988409 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-ndnrr" Feb 15 17:19:44 crc kubenswrapper[4585]: I0215 17:19:44.027675 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-chqj4" Feb 15 17:19:44 crc kubenswrapper[4585]: I0215 17:19:44.068175 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-ztxlq" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.453346 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gmlr4"] Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.455290 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.457553 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.457823 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.458939 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-hcrpk" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.461659 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.468433 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gmlr4"] Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.528535 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lfp25"] Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.532562 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.537352 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.540306 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4gm\" (UniqueName: \"kubernetes.io/projected/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-kube-api-access-gr4gm\") pod \"dnsmasq-dns-675f4bcbfc-gmlr4\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.540386 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-config\") pod \"dnsmasq-dns-675f4bcbfc-gmlr4\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.575814 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lfp25"] Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.641466 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.641530 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-config\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 
crc kubenswrapper[4585]: I0215 17:20:00.641557 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-config\") pod \"dnsmasq-dns-675f4bcbfc-gmlr4\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.641616 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tl8g\" (UniqueName: \"kubernetes.io/projected/265e6207-988e-447a-a4ad-1748702286d7-kube-api-access-4tl8g\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.641674 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4gm\" (UniqueName: \"kubernetes.io/projected/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-kube-api-access-gr4gm\") pod \"dnsmasq-dns-675f4bcbfc-gmlr4\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.642685 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-config\") pod \"dnsmasq-dns-675f4bcbfc-gmlr4\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.674403 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4gm\" (UniqueName: \"kubernetes.io/projected/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-kube-api-access-gr4gm\") pod \"dnsmasq-dns-675f4bcbfc-gmlr4\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 
17:20:00.742544 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-config\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.742660 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tl8g\" (UniqueName: \"kubernetes.io/projected/265e6207-988e-447a-a4ad-1748702286d7-kube-api-access-4tl8g\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.742734 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.743533 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-config\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.743547 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.760833 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tl8g\" 
(UniqueName: \"kubernetes.io/projected/265e6207-988e-447a-a4ad-1748702286d7-kube-api-access-4tl8g\") pod \"dnsmasq-dns-78dd6ddcc-lfp25\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.777429 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:00 crc kubenswrapper[4585]: I0215 17:20:00.869668 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:01 crc kubenswrapper[4585]: I0215 17:20:01.336661 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gmlr4"] Feb 15 17:20:01 crc kubenswrapper[4585]: W0215 17:20:01.342156 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2325dc0d_1b7b_4d8b_9bba_e312258b73a7.slice/crio-f0d6fc8d238a1bd4ec92593e0b43bb8d84efd854558c9d094408354fd163ca89 WatchSource:0}: Error finding container f0d6fc8d238a1bd4ec92593e0b43bb8d84efd854558c9d094408354fd163ca89: Status 404 returned error can't find the container with id f0d6fc8d238a1bd4ec92593e0b43bb8d84efd854558c9d094408354fd163ca89 Feb 15 17:20:01 crc kubenswrapper[4585]: I0215 17:20:01.344670 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:20:01 crc kubenswrapper[4585]: I0215 17:20:01.390275 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lfp25"] Feb 15 17:20:01 crc kubenswrapper[4585]: W0215 17:20:01.400833 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod265e6207_988e_447a_a4ad_1748702286d7.slice/crio-aa00a00accd6744a160dafb3112ed4d780099953cc8998e89fa79a26fe81d1d4 WatchSource:0}: Error finding container 
aa00a00accd6744a160dafb3112ed4d780099953cc8998e89fa79a26fe81d1d4: Status 404 returned error can't find the container with id aa00a00accd6744a160dafb3112ed4d780099953cc8998e89fa79a26fe81d1d4 Feb 15 17:20:02 crc kubenswrapper[4585]: I0215 17:20:02.137302 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" event={"ID":"265e6207-988e-447a-a4ad-1748702286d7","Type":"ContainerStarted","Data":"aa00a00accd6744a160dafb3112ed4d780099953cc8998e89fa79a26fe81d1d4"} Feb 15 17:20:02 crc kubenswrapper[4585]: I0215 17:20:02.140331 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" event={"ID":"2325dc0d-1b7b-4d8b-9bba-e312258b73a7","Type":"ContainerStarted","Data":"f0d6fc8d238a1bd4ec92593e0b43bb8d84efd854558c9d094408354fd163ca89"} Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.169313 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gmlr4"] Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.198068 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbbjr"] Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.199456 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.211218 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbbjr"] Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.302525 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.302610 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7nn\" (UniqueName: \"kubernetes.io/projected/e74925c2-940d-48c3-853b-a0664af9b31d-kube-api-access-dw7nn\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.302649 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-config\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.404339 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7nn\" (UniqueName: \"kubernetes.io/projected/e74925c2-940d-48c3-853b-a0664af9b31d-kube-api-access-dw7nn\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.404383 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-config\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.404483 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.405246 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.405770 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-config\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.462745 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7nn\" (UniqueName: \"kubernetes.io/projected/e74925c2-940d-48c3-853b-a0664af9b31d-kube-api-access-dw7nn\") pod \"dnsmasq-dns-666b6646f7-jbbjr\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.523726 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.800113 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lfp25"] Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.882653 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4bfs4"] Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.883979 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:03 crc kubenswrapper[4585]: I0215 17:20:03.919168 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4bfs4"] Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.053846 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.053917 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-config\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.053939 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkx5c\" (UniqueName: \"kubernetes.io/projected/22360463-bc5c-4e70-abbe-710820b7344b-kube-api-access-tkx5c\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 
17:20:04.155437 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.155826 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-config\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.155857 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkx5c\" (UniqueName: \"kubernetes.io/projected/22360463-bc5c-4e70-abbe-710820b7344b-kube-api-access-tkx5c\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.157807 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.159807 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-config\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.173506 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkx5c\" 
(UniqueName: \"kubernetes.io/projected/22360463-bc5c-4e70-abbe-710820b7344b-kube-api-access-tkx5c\") pod \"dnsmasq-dns-57d769cc4f-4bfs4\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.207681 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.519636 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.521563 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.531273 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.531448 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.531562 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.531678 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.531775 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8xhsl" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.532477 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.532578 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.553941 4585 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbbjr"] Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.567578 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.684649 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.684926 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.684969 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9de65e3c-3874-4fc0-9566-84138bb228b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.684988 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn59p\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-kube-api-access-qn59p\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.685018 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/9de65e3c-3874-4fc0-9566-84138bb228b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.685129 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.685279 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.685447 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.685522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.685592 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.685716 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.786672 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.786772 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.786797 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " 
pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787699 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787755 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787777 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787809 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9de65e3c-3874-4fc0-9566-84138bb228b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787828 4585 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qn59p\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-kube-api-access-qn59p\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787852 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9de65e3c-3874-4fc0-9566-84138bb228b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.787884 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.788188 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.788409 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.790339 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: 
\"9de65e3c-3874-4fc0-9566-84138bb228b7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.791311 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.792489 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.792888 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9de65e3c-3874-4fc0-9566-84138bb228b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.795657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9de65e3c-3874-4fc0-9566-84138bb228b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.795946 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9de65e3c-3874-4fc0-9566-84138bb228b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.806985 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.816255 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn59p\" (UniqueName: \"kubernetes.io/projected/9de65e3c-3874-4fc0-9566-84138bb228b7-kube-api-access-qn59p\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.840902 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9de65e3c-3874-4fc0-9566-84138bb228b7\") " pod="openstack/rabbitmq-server-0" Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.867188 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4bfs4"] Feb 15 17:20:04 crc kubenswrapper[4585]: I0215 17:20:04.892419 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.187520 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" event={"ID":"22360463-bc5c-4e70-abbe-710820b7344b","Type":"ContainerStarted","Data":"845e0eccadb98a41ae7c7c61cc2f946ce6e014dfe3646021f983665fe97c6d18"} Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.189996 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" event={"ID":"e74925c2-940d-48c3-853b-a0664af9b31d","Type":"ContainerStarted","Data":"c6c32d025092318060cb7a5d000a10a7454b6fae75911672be7661a93373ce9b"} Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.366817 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 15 17:20:05 crc kubenswrapper[4585]: W0215 17:20:05.376159 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de65e3c_3874_4fc0_9566_84138bb228b7.slice/crio-a5c882c080730d7f7d3097315f5fd72b1cca01ccc51f44a2f1bd36de1a051ca2 WatchSource:0}: Error finding container a5c882c080730d7f7d3097315f5fd72b1cca01ccc51f44a2f1bd36de1a051ca2: Status 404 returned error can't find the container with id a5c882c080730d7f7d3097315f5fd72b1cca01ccc51f44a2f1bd36de1a051ca2 Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.791084 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.792483 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.794759 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.795431 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.795555 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.795681 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xp9qf" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.813409 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.825559 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.911628 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.911675 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-config-data-default\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.911716 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8wwln\" (UniqueName: \"kubernetes.io/projected/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-kube-api-access-8wwln\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.911737 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.912051 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.912152 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.912206 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-kolla-config\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:05 crc kubenswrapper[4585]: I0215 17:20:05.912236 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014721 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014781 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014809 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-kolla-config\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014831 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014891 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014921 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-config-data-default\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014951 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwln\" (UniqueName: \"kubernetes.io/projected/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-kube-api-access-8wwln\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.014969 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.015245 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.017406 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 
17:20:06.020308 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-config-data-default\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.026940 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.027258 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-kolla-config\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.032441 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.040733 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.043497 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwln\" (UniqueName: 
\"kubernetes.io/projected/0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c-kube-api-access-8wwln\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.099979 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c\") " pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.117831 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.229654 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9de65e3c-3874-4fc0-9566-84138bb228b7","Type":"ContainerStarted","Data":"a5c882c080730d7f7d3097315f5fd72b1cca01ccc51f44a2f1bd36de1a051ca2"} Feb 15 17:20:06 crc kubenswrapper[4585]: I0215 17:20:06.751528 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.132773 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.134376 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.137150 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.137888 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-b4fzc" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.138022 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.138124 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.151333 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.241535 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.241839 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.241895 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.241942 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/858caedb-5be5-49ce-b806-489e3c0531b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.242054 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858caedb-5be5-49ce-b806-489e3c0531b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.242140 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4g46\" (UniqueName: \"kubernetes.io/projected/858caedb-5be5-49ce-b806-489e3c0531b5-kube-api-access-n4g46\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.242162 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.242365 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/858caedb-5be5-49ce-b806-489e3c0531b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.246064 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c","Type":"ContainerStarted","Data":"155b8cd5a8cc291b476d66c5bd067244ea612f766b1d847b5c47b80064c3601c"} Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344411 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344464 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344491 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/858caedb-5be5-49ce-b806-489e3c0531b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344509 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858caedb-5be5-49ce-b806-489e3c0531b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344536 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344551 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4g46\" (UniqueName: \"kubernetes.io/projected/858caedb-5be5-49ce-b806-489e3c0531b5-kube-api-access-n4g46\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344608 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/858caedb-5be5-49ce-b806-489e3c0531b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344654 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.344913 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.347466 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/858caedb-5be5-49ce-b806-489e3c0531b5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.347725 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.348238 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.348643 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/858caedb-5be5-49ce-b806-489e3c0531b5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.352443 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858caedb-5be5-49ce-b806-489e3c0531b5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.358178 
4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/858caedb-5be5-49ce-b806-489e3c0531b5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.366997 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4g46\" (UniqueName: \"kubernetes.io/projected/858caedb-5be5-49ce-b806-489e3c0531b5-kube-api-access-n4g46\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.376831 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"858caedb-5be5-49ce-b806-489e3c0531b5\") " pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.481128 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.492949 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.494754 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.496843 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9jkzx" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.498103 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.498253 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.510071 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.684879 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b5mm\" (UniqueName: \"kubernetes.io/projected/66d1b09a-b816-4f0f-be98-6963462597ab-kube-api-access-8b5mm\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.684931 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d1b09a-b816-4f0f-be98-6963462597ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.685035 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66d1b09a-b816-4f0f-be98-6963462597ab-kolla-config\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.685094 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1b09a-b816-4f0f-be98-6963462597ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:07 crc kubenswrapper[4585]: I0215 17:20:07.685118 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d1b09a-b816-4f0f-be98-6963462597ab-config-data\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.793667 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5mm\" (UniqueName: \"kubernetes.io/projected/66d1b09a-b816-4f0f-be98-6963462597ab-kube-api-access-8b5mm\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.794063 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d1b09a-b816-4f0f-be98-6963462597ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.794180 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66d1b09a-b816-4f0f-be98-6963462597ab-kolla-config\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.794234 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1b09a-b816-4f0f-be98-6963462597ab-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.794277 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d1b09a-b816-4f0f-be98-6963462597ab-config-data\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.795735 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66d1b09a-b816-4f0f-be98-6963462597ab-config-data\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.799802 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66d1b09a-b816-4f0f-be98-6963462597ab-kolla-config\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.806104 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d1b09a-b816-4f0f-be98-6963462597ab-combined-ca-bundle\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.806538 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d1b09a-b816-4f0f-be98-6963462597ab-memcached-tls-certs\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.813939 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b5mm\" (UniqueName: 
\"kubernetes.io/projected/66d1b09a-b816-4f0f-be98-6963462597ab-kube-api-access-8b5mm\") pod \"memcached-0\" (UID: \"66d1b09a-b816-4f0f-be98-6963462597ab\") " pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:07.915331 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 15 17:20:08 crc kubenswrapper[4585]: I0215 17:20:08.784301 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 15 17:20:09 crc kubenswrapper[4585]: I0215 17:20:09.730228 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:20:09 crc kubenswrapper[4585]: I0215 17:20:09.732036 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 15 17:20:09 crc kubenswrapper[4585]: I0215 17:20:09.733801 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-qvbnt" Feb 15 17:20:09 crc kubenswrapper[4585]: I0215 17:20:09.749553 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:20:09 crc kubenswrapper[4585]: I0215 17:20:09.827818 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddwsn\" (UniqueName: \"kubernetes.io/projected/a02ef963-4b80-427c-9f23-c033a729b944-kube-api-access-ddwsn\") pod \"kube-state-metrics-0\" (UID: \"a02ef963-4b80-427c-9f23-c033a729b944\") " pod="openstack/kube-state-metrics-0" Feb 15 17:20:09 crc kubenswrapper[4585]: I0215 17:20:09.929212 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddwsn\" (UniqueName: \"kubernetes.io/projected/a02ef963-4b80-427c-9f23-c033a729b944-kube-api-access-ddwsn\") pod \"kube-state-metrics-0\" (UID: \"a02ef963-4b80-427c-9f23-c033a729b944\") " pod="openstack/kube-state-metrics-0" Feb 15 17:20:09 crc 
kubenswrapper[4585]: I0215 17:20:09.959529 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddwsn\" (UniqueName: \"kubernetes.io/projected/a02ef963-4b80-427c-9f23-c033a729b944-kube-api-access-ddwsn\") pod \"kube-state-metrics-0\" (UID: \"a02ef963-4b80-427c-9f23-c033a729b944\") " pod="openstack/kube-state-metrics-0" Feb 15 17:20:10 crc kubenswrapper[4585]: I0215 17:20:10.110941 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.108779 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-87pkc"] Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.110989 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.118309 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tsz7f" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.118541 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.125469 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.128005 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-87pkc"] Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.139616 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hzct7"] Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.141779 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.170816 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hzct7"] Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.197440 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-run\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.197523 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-log-ovn\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.197544 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwjr9\" (UniqueName: \"kubernetes.io/projected/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-kube-api-access-mwjr9\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.197720 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-run-ovn\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.197825 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-scripts\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.197874 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-combined-ca-bundle\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.198024 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-ovn-controller-tls-certs\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-log-ovn\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299169 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwjr9\" (UniqueName: \"kubernetes.io/projected/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-kube-api-access-mwjr9\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299199 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-log\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299242 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-run-ovn\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299267 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-lib\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299287 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-etc-ovs\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299303 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-scripts\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299330 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-combined-ca-bundle\") pod \"ovn-controller-87pkc\" (UID: 
\"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299382 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50cd83ac-87a5-46e8-be00-9b8cf954efe0-scripts\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299405 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-ovn-controller-tls-certs\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299427 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-run\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299442 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-run\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299467 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47l9z\" (UniqueName: \"kubernetes.io/projected/50cd83ac-87a5-46e8-be00-9b8cf954efe0-kube-api-access-47l9z\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " 
pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.299961 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-log-ovn\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.300393 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-run-ovn\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.302367 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-scripts\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.304169 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-var-run\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.308420 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-ovn-controller-tls-certs\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.310098 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-combined-ca-bundle\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.317219 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwjr9\" (UniqueName: \"kubernetes.io/projected/2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9-kube-api-access-mwjr9\") pod \"ovn-controller-87pkc\" (UID: \"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9\") " pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.395145 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.396685 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.400880 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-log\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.400949 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-lib\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.400969 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-etc-ovs\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " 
pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.401035 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50cd83ac-87a5-46e8-be00-9b8cf954efe0-scripts\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.401061 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-run\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.401080 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47l9z\" (UniqueName: \"kubernetes.io/projected/50cd83ac-87a5-46e8-be00-9b8cf954efe0-kube-api-access-47l9z\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.401463 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-log\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.401566 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-lib\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.401676 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-etc-ovs\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.402390 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/50cd83ac-87a5-46e8-be00-9b8cf954efe0-var-run\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.403281 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/50cd83ac-87a5-46e8-be00-9b8cf954efe0-scripts\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.404032 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.404159 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.404083 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-4n445" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.404275 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.404794 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.432657 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 
15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.437935 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-87pkc" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.443315 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47l9z\" (UniqueName: \"kubernetes.io/projected/50cd83ac-87a5-46e8-be00-9b8cf954efe0-kube-api-access-47l9z\") pod \"ovn-controller-ovs-hzct7\" (UID: \"50cd83ac-87a5-46e8-be00-9b8cf954efe0\") " pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.472395 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.502878 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.503147 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba94e4cb-032f-451a-afd7-f908bef47709-config\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.503222 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba94e4cb-032f-451a-afd7-f908bef47709-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.503362 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.503450 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.503526 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.503568 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba94e4cb-032f-451a-afd7-f908bef47709-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.503770 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbk44\" (UniqueName: \"kubernetes.io/projected/ba94e4cb-032f-451a-afd7-f908bef47709-kube-api-access-fbk44\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605175 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ba94e4cb-032f-451a-afd7-f908bef47709-config\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605224 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba94e4cb-032f-451a-afd7-f908bef47709-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605270 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605300 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605332 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605355 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba94e4cb-032f-451a-afd7-f908bef47709-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 
crc kubenswrapper[4585]: I0215 17:20:13.605393 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbk44\" (UniqueName: \"kubernetes.io/projected/ba94e4cb-032f-451a-afd7-f908bef47709-kube-api-access-fbk44\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605442 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605641 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.605791 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba94e4cb-032f-451a-afd7-f908bef47709-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.606126 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba94e4cb-032f-451a-afd7-f908bef47709-config\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.606885 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ba94e4cb-032f-451a-afd7-f908bef47709-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.620228 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.625297 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.628038 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbk44\" (UniqueName: \"kubernetes.io/projected/ba94e4cb-032f-451a-afd7-f908bef47709-kube-api-access-fbk44\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.633314 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba94e4cb-032f-451a-afd7-f908bef47709-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.656896 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ba94e4cb-032f-451a-afd7-f908bef47709\") " pod="openstack/ovsdbserver-nb-0" Feb 
15 17:20:13 crc kubenswrapper[4585]: I0215 17:20:13.720841 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:15 crc kubenswrapper[4585]: W0215 17:20:15.896286 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod858caedb_5be5_49ce_b806_489e3c0531b5.slice/crio-7f8ae2c10b340e0e28707deaeecdb469424f2da05a8723da20dddd4cb35e7bf1 WatchSource:0}: Error finding container 7f8ae2c10b340e0e28707deaeecdb469424f2da05a8723da20dddd4cb35e7bf1: Status 404 returned error can't find the container with id 7f8ae2c10b340e0e28707deaeecdb469424f2da05a8723da20dddd4cb35e7bf1 Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.436387 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"858caedb-5be5-49ce-b806-489e3c0531b5","Type":"ContainerStarted","Data":"7f8ae2c10b340e0e28707deaeecdb469424f2da05a8723da20dddd4cb35e7bf1"} Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.497687 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.499323 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.502003 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.528886 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-56p2z" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.529540 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.529581 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.529547 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.673387 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e892e065-8113-4612-a7f6-808490c8b000-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.673426 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e892e065-8113-4612-a7f6-808490c8b000-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.673471 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hbbd\" (UniqueName: \"kubernetes.io/projected/e892e065-8113-4612-a7f6-808490c8b000-kube-api-access-9hbbd\") pod \"ovsdbserver-sb-0\" (UID: 
\"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.673496 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.673776 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.673816 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.674688 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.674720 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e892e065-8113-4612-a7f6-808490c8b000-config\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 
15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.775961 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.775998 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.776030 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.776049 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e892e065-8113-4612-a7f6-808490c8b000-config\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.776103 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e892e065-8113-4612-a7f6-808490c8b000-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.776120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/e892e065-8113-4612-a7f6-808490c8b000-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.776154 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hbbd\" (UniqueName: \"kubernetes.io/projected/e892e065-8113-4612-a7f6-808490c8b000-kube-api-access-9hbbd\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.776176 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.776267 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.777450 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e892e065-8113-4612-a7f6-808490c8b000-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.777804 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e892e065-8113-4612-a7f6-808490c8b000-config\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" 
Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.778645 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e892e065-8113-4612-a7f6-808490c8b000-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.789276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.789317 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.793832 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hbbd\" (UniqueName: \"kubernetes.io/projected/e892e065-8113-4612-a7f6-808490c8b000-kube-api-access-9hbbd\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.802088 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e892e065-8113-4612-a7f6-808490c8b000-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.805509 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e892e065-8113-4612-a7f6-808490c8b000\") " pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:16 crc kubenswrapper[4585]: I0215 17:20:16.845667 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:17 crc kubenswrapper[4585]: I0215 17:20:17.014054 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:20:17 crc kubenswrapper[4585]: I0215 17:20:17.014107 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.470115 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jr8n"] Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.473115 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.477166 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jr8n"] Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.570165 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-utilities\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.570214 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-catalog-content\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.570297 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpgvm\" (UniqueName: \"kubernetes.io/projected/0e2147ab-2614-4b64-9f18-a45bec1f4937-kube-api-access-fpgvm\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.671741 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-utilities\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.671801 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-catalog-content\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.671891 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpgvm\" (UniqueName: \"kubernetes.io/projected/0e2147ab-2614-4b64-9f18-a45bec1f4937-kube-api-access-fpgvm\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.672515 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-catalog-content\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.675734 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-utilities\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.688454 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpgvm\" (UniqueName: \"kubernetes.io/projected/0e2147ab-2614-4b64-9f18-a45bec1f4937-kube-api-access-fpgvm\") pod \"community-operators-8jr8n\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:20 crc kubenswrapper[4585]: I0215 17:20:20.799449 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:23 crc kubenswrapper[4585]: I0215 17:20:23.820688 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.279104 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.279698 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dw7nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-jbbjr_openstack(e74925c2-940d-48c3-853b-a0664af9b31d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.280933 4585 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.281249 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.281324 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tl8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-lfp25_openstack(265e6207-988e-447a-a4ad-1748702286d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.282612 4585 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" podUID="265e6207-988e-447a-a4ad-1748702286d7" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.287865 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.288017 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gr4gm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-gmlr4_openstack(2325dc0d-1b7b-4d8b-9bba-e312258b73a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.292004 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" podUID="2325dc0d-1b7b-4d8b-9bba-e312258b73a7" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.320125 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.320320 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkx5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPoli
cy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-4bfs4_openstack(22360463-bc5c-4e70-abbe-710820b7344b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.321864 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" podUID="22360463-bc5c-4e70-abbe-710820b7344b" Feb 15 17:20:28 crc kubenswrapper[4585]: I0215 17:20:28.584972 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"66d1b09a-b816-4f0f-be98-6963462597ab","Type":"ContainerStarted","Data":"6f3ca8398f7b2ecdbeda535bfcfb88eeaf9125ad07f829790782c446818004af"} Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.587175 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" podUID="22360463-bc5c-4e70-abbe-710820b7344b" Feb 15 17:20:28 crc kubenswrapper[4585]: E0215 17:20:28.588794 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" Feb 15 17:20:30 crc kubenswrapper[4585]: E0215 17:20:30.370027 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Feb 15 17:20:30 crc kubenswrapper[4585]: E0215 17:20:30.372501 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qn59p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(9de65e3c-3874-4fc0-9566-84138bb228b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:20:30 crc 
kubenswrapper[4585]: E0215 17:20:30.373981 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="9de65e3c-3874-4fc0-9566-84138bb228b7" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.584941 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.611783 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.652380 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" event={"ID":"265e6207-988e-447a-a4ad-1748702286d7","Type":"ContainerDied","Data":"aa00a00accd6744a160dafb3112ed4d780099953cc8998e89fa79a26fe81d1d4"} Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.652458 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lfp25" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.678810 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.679254 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gmlr4" event={"ID":"2325dc0d-1b7b-4d8b-9bba-e312258b73a7","Type":"ContainerDied","Data":"f0d6fc8d238a1bd4ec92593e0b43bb8d84efd854558c9d094408354fd163ca89"} Feb 15 17:20:30 crc kubenswrapper[4585]: E0215 17:20:30.683036 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-server-0" podUID="9de65e3c-3874-4fc0-9566-84138bb228b7" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.697182 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr4gm\" (UniqueName: \"kubernetes.io/projected/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-kube-api-access-gr4gm\") pod \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.697500 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-config\") pod \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\" (UID: \"2325dc0d-1b7b-4d8b-9bba-e312258b73a7\") " Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.706861 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-kube-api-access-gr4gm" (OuterVolumeSpecName: "kube-api-access-gr4gm") pod "2325dc0d-1b7b-4d8b-9bba-e312258b73a7" (UID: "2325dc0d-1b7b-4d8b-9bba-e312258b73a7"). InnerVolumeSpecName "kube-api-access-gr4gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.707339 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-config" (OuterVolumeSpecName: "config") pod "2325dc0d-1b7b-4d8b-9bba-e312258b73a7" (UID: "2325dc0d-1b7b-4d8b-9bba-e312258b73a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.813714 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tl8g\" (UniqueName: \"kubernetes.io/projected/265e6207-988e-447a-a4ad-1748702286d7-kube-api-access-4tl8g\") pod \"265e6207-988e-447a-a4ad-1748702286d7\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.822486 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-dns-svc\") pod \"265e6207-988e-447a-a4ad-1748702286d7\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.822567 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-config\") pod \"265e6207-988e-447a-a4ad-1748702286d7\" (UID: \"265e6207-988e-447a-a4ad-1748702286d7\") " Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.824756 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265e6207-988e-447a-a4ad-1748702286d7-kube-api-access-4tl8g" (OuterVolumeSpecName: "kube-api-access-4tl8g") pod "265e6207-988e-447a-a4ad-1748702286d7" (UID: "265e6207-988e-447a-a4ad-1748702286d7"). InnerVolumeSpecName "kube-api-access-4tl8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.824859 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "265e6207-988e-447a-a4ad-1748702286d7" (UID: "265e6207-988e-447a-a4ad-1748702286d7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.825097 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-config" (OuterVolumeSpecName: "config") pod "265e6207-988e-447a-a4ad-1748702286d7" (UID: "265e6207-988e-447a-a4ad-1748702286d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.826674 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.826691 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.826700 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr4gm\" (UniqueName: \"kubernetes.io/projected/2325dc0d-1b7b-4d8b-9bba-e312258b73a7-kube-api-access-gr4gm\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.826708 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tl8g\" (UniqueName: \"kubernetes.io/projected/265e6207-988e-447a-a4ad-1748702286d7-kube-api-access-4tl8g\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:30 crc 
kubenswrapper[4585]: I0215 17:20:30.826716 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/265e6207-988e-447a-a4ad-1748702286d7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.944278 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hklp2"] Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.960313 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:30 crc kubenswrapper[4585]: I0215 17:20:30.968970 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hklp2"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.041461 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzfq2\" (UniqueName: \"kubernetes.io/projected/89311b3a-05ab-45c7-a6df-e8f671bdab4e-kube-api-access-fzfq2\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.041637 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-catalog-content\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.041688 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-utilities\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " 
pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.143949 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-utilities\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.144041 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzfq2\" (UniqueName: \"kubernetes.io/projected/89311b3a-05ab-45c7-a6df-e8f671bdab4e-kube-api-access-fzfq2\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.144111 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-catalog-content\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.144483 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-catalog-content\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.145842 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-utilities\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" 
Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.177517 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzfq2\" (UniqueName: \"kubernetes.io/projected/89311b3a-05ab-45c7-a6df-e8f671bdab4e-kube-api-access-fzfq2\") pod \"redhat-marketplace-hklp2\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.218486 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gmlr4"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.228237 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gmlr4"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.262530 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lfp25"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.270300 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lfp25"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.290083 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.343731 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.480264 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-87pkc"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.501007 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jr8n"] Feb 15 17:20:31 crc kubenswrapper[4585]: W0215 17:20:31.539357 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e2147ab_2614_4b64_9f18_a45bec1f4937.slice/crio-6e8fb597f958a483e669d8652e9fdc5ef9c4f71c4095377845b722cf75306620 WatchSource:0}: Error finding container 6e8fb597f958a483e669d8652e9fdc5ef9c4f71c4095377845b722cf75306620: Status 404 returned error can't find the container with id 6e8fb597f958a483e669d8652e9fdc5ef9c4f71c4095377845b722cf75306620 Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.710798 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jr8n" event={"ID":"0e2147ab-2614-4b64-9f18-a45bec1f4937","Type":"ContainerStarted","Data":"6e8fb597f958a483e669d8652e9fdc5ef9c4f71c4095377845b722cf75306620"} Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.716146 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"858caedb-5be5-49ce-b806-489e3c0531b5","Type":"ContainerStarted","Data":"a10e83f1539bbd633bf5376b4f8a2699d69633fdd9872d1762e22eadc152c48e"} Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.718360 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-87pkc" event={"ID":"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9","Type":"ContainerStarted","Data":"3d9a0803adc855f6c1e4ca5d1122880125c8ebfcf5c08cec3a999035e9a3a227"} Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.722614 4585 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a02ef963-4b80-427c-9f23-c033a729b944","Type":"ContainerStarted","Data":"36023e5f528efdbbcbb2fb52bb7be7343c5672eb1006890e5a2819332e385de1"} Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.727837 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c","Type":"ContainerStarted","Data":"0fad65f702d64dd072233cd688c03f68b2cc997283508f551f832d66355a20b3"} Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.959292 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hzct7"] Feb 15 17:20:31 crc kubenswrapper[4585]: I0215 17:20:31.966805 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hklp2"] Feb 15 17:20:32 crc kubenswrapper[4585]: W0215 17:20:32.023854 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89311b3a_05ab_45c7_a6df_e8f671bdab4e.slice/crio-2335b94b2d64cf0ba289b1e6ef597a6aa7d10f0303163d0f7df47c4aa978f412 WatchSource:0}: Error finding container 2335b94b2d64cf0ba289b1e6ef597a6aa7d10f0303163d0f7df47c4aa978f412: Status 404 returned error can't find the container with id 2335b94b2d64cf0ba289b1e6ef597a6aa7d10f0303163d0f7df47c4aa978f412 Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.511252 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 15 17:20:32 crc kubenswrapper[4585]: W0215 17:20:32.659366 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode892e065_8113_4612_a7f6_808490c8b000.slice/crio-d584b3dd73146619c74d56c2646bfea39a12aac66ee4fccfd6972f1a6203ff27 WatchSource:0}: Error finding container d584b3dd73146619c74d56c2646bfea39a12aac66ee4fccfd6972f1a6203ff27: Status 404 returned error can't find the container with 
id d584b3dd73146619c74d56c2646bfea39a12aac66ee4fccfd6972f1a6203ff27 Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.745149 4585 generic.go:334] "Generic (PLEG): container finished" podID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerID="67a34be8280d6d2510a13be9fdc5422b73bec7c2221a31f9c318d093148d9b13" exitCode=0 Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.745475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jr8n" event={"ID":"0e2147ab-2614-4b64-9f18-a45bec1f4937","Type":"ContainerDied","Data":"67a34be8280d6d2510a13be9fdc5422b73bec7c2221a31f9c318d093148d9b13"} Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.752735 4585 generic.go:334] "Generic (PLEG): container finished" podID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerID="87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5" exitCode=0 Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.752792 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hklp2" event={"ID":"89311b3a-05ab-45c7-a6df-e8f671bdab4e","Type":"ContainerDied","Data":"87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5"} Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.752809 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hklp2" event={"ID":"89311b3a-05ab-45c7-a6df-e8f671bdab4e","Type":"ContainerStarted","Data":"2335b94b2d64cf0ba289b1e6ef597a6aa7d10f0303163d0f7df47c4aa978f412"} Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.757039 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hzct7" event={"ID":"50cd83ac-87a5-46e8-be00-9b8cf954efe0","Type":"ContainerStarted","Data":"9b21c36332158c7befd4b3d03aff96cdd9a8122f6e499b536de6f7ea2ee2e242"} Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.761129 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"e892e065-8113-4612-a7f6-808490c8b000","Type":"ContainerStarted","Data":"d584b3dd73146619c74d56c2646bfea39a12aac66ee4fccfd6972f1a6203ff27"} Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.813637 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.868741 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2325dc0d-1b7b-4d8b-9bba-e312258b73a7" path="/var/lib/kubelet/pods/2325dc0d-1b7b-4d8b-9bba-e312258b73a7/volumes" Feb 15 17:20:32 crc kubenswrapper[4585]: I0215 17:20:32.869108 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265e6207-988e-447a-a4ad-1748702286d7" path="/var/lib/kubelet/pods/265e6207-988e-447a-a4ad-1748702286d7/volumes" Feb 15 17:20:33 crc kubenswrapper[4585]: I0215 17:20:33.774465 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ba94e4cb-032f-451a-afd7-f908bef47709","Type":"ContainerStarted","Data":"a5a88c5692ff1b422f6dfe192312cfe3b0303ec0ccecbfb8ae1468b36f72fd50"} Feb 15 17:20:35 crc kubenswrapper[4585]: I0215 17:20:35.789751 4585 generic.go:334] "Generic (PLEG): container finished" podID="0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c" containerID="0fad65f702d64dd072233cd688c03f68b2cc997283508f551f832d66355a20b3" exitCode=0 Feb 15 17:20:35 crc kubenswrapper[4585]: I0215 17:20:35.789851 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c","Type":"ContainerDied","Data":"0fad65f702d64dd072233cd688c03f68b2cc997283508f551f832d66355a20b3"} Feb 15 17:20:35 crc kubenswrapper[4585]: I0215 17:20:35.793300 4585 generic.go:334] "Generic (PLEG): container finished" podID="858caedb-5be5-49ce-b806-489e3c0531b5" containerID="a10e83f1539bbd633bf5376b4f8a2699d69633fdd9872d1762e22eadc152c48e" exitCode=0 Feb 15 17:20:35 crc kubenswrapper[4585]: 
I0215 17:20:35.793344 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"858caedb-5be5-49ce-b806-489e3c0531b5","Type":"ContainerDied","Data":"a10e83f1539bbd633bf5376b4f8a2699d69633fdd9872d1762e22eadc152c48e"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.822326 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e892e065-8113-4612-a7f6-808490c8b000","Type":"ContainerStarted","Data":"a188294634a982e6f05b8f44b86e3c962351899b0c92f3088cd327836725d022"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.823841 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ba94e4cb-032f-451a-afd7-f908bef47709","Type":"ContainerStarted","Data":"eeac6ed75a9dff0f9c6b980ab08517c3366606734ba05f55ee1ca1429d9bedec"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.825676 4585 generic.go:334] "Generic (PLEG): container finished" podID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerID="ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad" exitCode=0 Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.825706 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hklp2" event={"ID":"89311b3a-05ab-45c7-a6df-e8f671bdab4e","Type":"ContainerDied","Data":"ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.827932 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-87pkc" event={"ID":"2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9","Type":"ContainerStarted","Data":"4a1457d27432768b54a7cfee4166666cb1c909fe17a629f8dc2961a823acc33b"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.828091 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-87pkc" Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.833210 4585 
generic.go:334] "Generic (PLEG): container finished" podID="50cd83ac-87a5-46e8-be00-9b8cf954efe0" containerID="98a5b9fffe779c985d8184b32a7a88aca468ae30a583e63785b1da7fea60d1bb" exitCode=0 Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.833440 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hzct7" event={"ID":"50cd83ac-87a5-46e8-be00-9b8cf954efe0","Type":"ContainerDied","Data":"98a5b9fffe779c985d8184b32a7a88aca468ae30a583e63785b1da7fea60d1bb"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.880766 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.883863 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a02ef963-4b80-427c-9f23-c033a729b944","Type":"ContainerStarted","Data":"b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.885585 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c","Type":"ContainerStarted","Data":"4b879de4decf831f419c094a8f047bc8a77b97982aec046d24e23fdc8e0188c0"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.891790 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jr8n" event={"ID":"0e2147ab-2614-4b64-9f18-a45bec1f4937","Type":"ContainerStarted","Data":"d2a01168e4e4a5d2788b0cae67b14d6557c565c8217471f1fc02ea8aa4bc3550"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.901314 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"66d1b09a-b816-4f0f-be98-6963462597ab","Type":"ContainerStarted","Data":"90b643fc1f6bdc7fe55e5b6382f9e11f4d06fcd3f393783e5f3574bd7a6f9de9"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.902146 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.908013 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"858caedb-5be5-49ce-b806-489e3c0531b5","Type":"ContainerStarted","Data":"02f77ed14336a07733d4ef3f1b742680d1ef16e2a72744baa6a146892b6273ef"} Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.908342 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-87pkc" podStartSLOduration=20.565984808 podStartE2EDuration="25.908318545s" podCreationTimestamp="2026-02-15 17:20:13 +0000 UTC" firstStartedPulling="2026-02-15 17:20:31.537971063 +0000 UTC m=+887.481379195" lastFinishedPulling="2026-02-15 17:20:36.88030479 +0000 UTC m=+892.823712932" observedRunningTime="2026-02-15 17:20:38.899200777 +0000 UTC m=+894.842608909" watchObservedRunningTime="2026-02-15 17:20:38.908318545 +0000 UTC m=+894.851726677" Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.928428 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=23.565220926 podStartE2EDuration="29.928413293s" podCreationTimestamp="2026-02-15 17:20:09 +0000 UTC" firstStartedPulling="2026-02-15 17:20:31.284354019 +0000 UTC m=+887.227762141" lastFinishedPulling="2026-02-15 17:20:37.647546376 +0000 UTC m=+893.590954508" observedRunningTime="2026-02-15 17:20:38.910366581 +0000 UTC m=+894.853774713" watchObservedRunningTime="2026-02-15 17:20:38.928413293 +0000 UTC m=+894.871821425" Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.957341 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=11.210392367 podStartE2EDuration="34.957326511s" podCreationTimestamp="2026-02-15 17:20:04 +0000 UTC" firstStartedPulling="2026-02-15 17:20:06.767942588 +0000 UTC m=+862.711350720" 
lastFinishedPulling="2026-02-15 17:20:30.514876732 +0000 UTC m=+886.458284864" observedRunningTime="2026-02-15 17:20:38.955696797 +0000 UTC m=+894.899104929" watchObservedRunningTime="2026-02-15 17:20:38.957326511 +0000 UTC m=+894.900734643" Feb 15 17:20:38 crc kubenswrapper[4585]: I0215 17:20:38.977285 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.064564256 podStartE2EDuration="31.977266135s" podCreationTimestamp="2026-02-15 17:20:07 +0000 UTC" firstStartedPulling="2026-02-15 17:20:28.294572384 +0000 UTC m=+884.237980516" lastFinishedPulling="2026-02-15 17:20:36.207274263 +0000 UTC m=+892.150682395" observedRunningTime="2026-02-15 17:20:38.972157476 +0000 UTC m=+894.915565598" watchObservedRunningTime="2026-02-15 17:20:38.977266135 +0000 UTC m=+894.920674267" Feb 15 17:20:39 crc kubenswrapper[4585]: I0215 17:20:39.000332 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.331283991 podStartE2EDuration="33.000313593s" podCreationTimestamp="2026-02-15 17:20:06 +0000 UTC" firstStartedPulling="2026-02-15 17:20:15.920715701 +0000 UTC m=+871.864123843" lastFinishedPulling="2026-02-15 17:20:30.589745313 +0000 UTC m=+886.533153445" observedRunningTime="2026-02-15 17:20:38.993837416 +0000 UTC m=+894.937245548" watchObservedRunningTime="2026-02-15 17:20:39.000313593 +0000 UTC m=+894.943721725" Feb 15 17:20:39 crc kubenswrapper[4585]: I0215 17:20:39.917306 4585 generic.go:334] "Generic (PLEG): container finished" podID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerID="d2a01168e4e4a5d2788b0cae67b14d6557c565c8217471f1fc02ea8aa4bc3550" exitCode=0 Feb 15 17:20:39 crc kubenswrapper[4585]: I0215 17:20:39.917802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jr8n" 
event={"ID":"0e2147ab-2614-4b64-9f18-a45bec1f4937","Type":"ContainerDied","Data":"d2a01168e4e4a5d2788b0cae67b14d6557c565c8217471f1fc02ea8aa4bc3550"} Feb 15 17:20:39 crc kubenswrapper[4585]: I0215 17:20:39.934695 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hzct7" event={"ID":"50cd83ac-87a5-46e8-be00-9b8cf954efe0","Type":"ContainerStarted","Data":"cef16d4a80627eab513dc5b5dbdbddb1f0fcfb2bd1fed44fbf027ffb6eec49cf"} Feb 15 17:20:40 crc kubenswrapper[4585]: I0215 17:20:40.954011 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ba94e4cb-032f-451a-afd7-f908bef47709","Type":"ContainerStarted","Data":"218c9931edc364875e5d758c00f01a6f2f96e3cbeee50521186b97759cac3859"} Feb 15 17:20:40 crc kubenswrapper[4585]: I0215 17:20:40.963743 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hklp2" event={"ID":"89311b3a-05ab-45c7-a6df-e8f671bdab4e","Type":"ContainerStarted","Data":"faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1"} Feb 15 17:20:40 crc kubenswrapper[4585]: I0215 17:20:40.967446 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hzct7" event={"ID":"50cd83ac-87a5-46e8-be00-9b8cf954efe0","Type":"ContainerStarted","Data":"fc21b84a43e10e59971d7baed44a5f6816be45e7d2b54eaf1b2615363b2587ee"} Feb 15 17:20:40 crc kubenswrapper[4585]: I0215 17:20:40.967657 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:40 crc kubenswrapper[4585]: I0215 17:20:40.967682 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:20:40 crc kubenswrapper[4585]: I0215 17:20:40.969128 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"e892e065-8113-4612-a7f6-808490c8b000","Type":"ContainerStarted","Data":"38aaa155d537e4d211bc5a431126527b132fb703954afcfff1d2e580aca41bb5"} Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.013876 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.959484596 podStartE2EDuration="26.013855675s" podCreationTimestamp="2026-02-15 17:20:15 +0000 UTC" firstStartedPulling="2026-02-15 17:20:32.664445451 +0000 UTC m=+888.607853583" lastFinishedPulling="2026-02-15 17:20:39.71881653 +0000 UTC m=+895.662224662" observedRunningTime="2026-02-15 17:20:41.01112914 +0000 UTC m=+896.954537282" watchObservedRunningTime="2026-02-15 17:20:41.013855675 +0000 UTC m=+896.957263807" Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.020208 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.977813836 podStartE2EDuration="29.020192607s" podCreationTimestamp="2026-02-15 17:20:12 +0000 UTC" firstStartedPulling="2026-02-15 17:20:33.715321299 +0000 UTC m=+889.658729431" lastFinishedPulling="2026-02-15 17:20:39.75770007 +0000 UTC m=+895.701108202" observedRunningTime="2026-02-15 17:20:40.993290233 +0000 UTC m=+896.936698365" watchObservedRunningTime="2026-02-15 17:20:41.020192607 +0000 UTC m=+896.963600729" Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.038011 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hklp2" podStartSLOduration=4.930043554 podStartE2EDuration="11.037994802s" podCreationTimestamp="2026-02-15 17:20:30 +0000 UTC" firstStartedPulling="2026-02-15 17:20:33.710789196 +0000 UTC m=+889.654197318" lastFinishedPulling="2026-02-15 17:20:39.818740434 +0000 UTC m=+895.762148566" observedRunningTime="2026-02-15 17:20:41.033575382 +0000 UTC m=+896.976983504" watchObservedRunningTime="2026-02-15 17:20:41.037994802 +0000 UTC 
m=+896.981402934" Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.061376 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hzct7" podStartSLOduration=23.341211374 podStartE2EDuration="28.06135701s" podCreationTimestamp="2026-02-15 17:20:13 +0000 UTC" firstStartedPulling="2026-02-15 17:20:32.035764753 +0000 UTC m=+887.979172885" lastFinishedPulling="2026-02-15 17:20:36.755910369 +0000 UTC m=+892.699318521" observedRunningTime="2026-02-15 17:20:41.055991543 +0000 UTC m=+896.999399675" watchObservedRunningTime="2026-02-15 17:20:41.06135701 +0000 UTC m=+897.004765142" Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.344840 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.345106 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.846148 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:41 crc kubenswrapper[4585]: I0215 17:20:41.978228 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jr8n" event={"ID":"0e2147ab-2614-4b64-9f18-a45bec1f4937","Type":"ContainerStarted","Data":"1cbdd76baa1eb5f1731d6420e1f3c4c9731ed2da2cf613832406017084f45b0e"} Feb 15 17:20:42 crc kubenswrapper[4585]: I0215 17:20:42.003432 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jr8n" podStartSLOduration=14.739032807 podStartE2EDuration="22.003411351s" podCreationTimestamp="2026-02-15 17:20:20 +0000 UTC" firstStartedPulling="2026-02-15 17:20:33.71093411 +0000 UTC m=+889.654342242" lastFinishedPulling="2026-02-15 17:20:40.975312654 +0000 UTC m=+896.918720786" 
observedRunningTime="2026-02-15 17:20:41.995942717 +0000 UTC m=+897.939350839" watchObservedRunningTime="2026-02-15 17:20:42.003411351 +0000 UTC m=+897.946819483" Feb 15 17:20:42 crc kubenswrapper[4585]: I0215 17:20:42.388444 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-hklp2" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="registry-server" probeResult="failure" output=< Feb 15 17:20:42 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:20:42 crc kubenswrapper[4585]: > Feb 15 17:20:43 crc kubenswrapper[4585]: I0215 17:20:43.722232 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:43 crc kubenswrapper[4585]: I0215 17:20:43.722626 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:43 crc kubenswrapper[4585]: I0215 17:20:43.840726 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:43 crc kubenswrapper[4585]: I0215 17:20:43.847127 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:43 crc kubenswrapper[4585]: I0215 17:20:43.897025 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:43 crc kubenswrapper[4585]: I0215 17:20:43.997311 4585 generic.go:334] "Generic (PLEG): container finished" podID="e74925c2-940d-48c3-853b-a0664af9b31d" containerID="4e508be139df2d023ec26d45045abda460004ae1a271515346ce58b63b55e00f" exitCode=0 Feb 15 17:20:43 crc kubenswrapper[4585]: I0215 17:20:43.997410 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" event={"ID":"e74925c2-940d-48c3-853b-a0664af9b31d","Type":"ContainerDied","Data":"4e508be139df2d023ec26d45045abda460004ae1a271515346ce58b63b55e00f"} 
Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:43.999428 4585 generic.go:334] "Generic (PLEG): container finished" podID="22360463-bc5c-4e70-abbe-710820b7344b" containerID="7c11368d3313342f16cf24ea380336caa3bdae61bcf493cd11cda9b43283083f" exitCode=0 Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:43.999511 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" event={"ID":"22360463-bc5c-4e70-abbe-710820b7344b","Type":"ContainerDied","Data":"7c11368d3313342f16cf24ea380336caa3bdae61bcf493cd11cda9b43283083f"} Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.070146 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.161059 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.387586 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4bfs4"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.413299 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hkzs9"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.418247 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.420256 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.429557 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hkzs9"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.469842 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszc8\" (UniqueName: \"kubernetes.io/projected/62bf19dd-0faa-4e78-b944-ef44f34d3a66-kube-api-access-tszc8\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.469893 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.469960 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-config\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.469991 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.574775 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tszc8\" (UniqueName: \"kubernetes.io/projected/62bf19dd-0faa-4e78-b944-ef44f34d3a66-kube-api-access-tszc8\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.574821 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.574866 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-config\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.574898 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.576021 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc 
kubenswrapper[4585]: I0215 17:20:44.576077 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-config\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.577470 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.596023 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tszc8\" (UniqueName: \"kubernetes.io/projected/62bf19dd-0faa-4e78-b944-ef44f34d3a66-kube-api-access-tszc8\") pod \"dnsmasq-dns-6bc7876d45-hkzs9\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.622997 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ntdrw"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.624917 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.628204 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.644648 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ntdrw"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.676465 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a706b455-b1aa-4b2d-9ee3-714cb8801089-ovn-rundir\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.676525 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a706b455-b1aa-4b2d-9ee3-714cb8801089-config\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.676572 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kg2g\" (UniqueName: \"kubernetes.io/projected/a706b455-b1aa-4b2d-9ee3-714cb8801089-kube-api-access-9kg2g\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.676614 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a706b455-b1aa-4b2d-9ee3-714cb8801089-combined-ca-bundle\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " 
pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.676646 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a706b455-b1aa-4b2d-9ee3-714cb8801089-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.676719 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a706b455-b1aa-4b2d-9ee3-714cb8801089-ovs-rundir\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.697519 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.704726 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.711419 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lqzkx" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.711618 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.713742 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbbjr"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.722795 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.723524 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.740111 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.758724 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.856560 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kg2g\" (UniqueName: \"kubernetes.io/projected/a706b455-b1aa-4b2d-9ee3-714cb8801089-kube-api-access-9kg2g\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857127 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a706b455-b1aa-4b2d-9ee3-714cb8801089-combined-ca-bundle\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857172 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25a62375-a3a2-44ec-b5e3-e03e3da6257e-scripts\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857204 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857226 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a706b455-b1aa-4b2d-9ee3-714cb8801089-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857245 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857273 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbl9l\" (UniqueName: \"kubernetes.io/projected/25a62375-a3a2-44ec-b5e3-e03e3da6257e-kube-api-access-wbl9l\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857321 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a706b455-b1aa-4b2d-9ee3-714cb8801089-ovs-rundir\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857359 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a62375-a3a2-44ec-b5e3-e03e3da6257e-config\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857378 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857419 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25a62375-a3a2-44ec-b5e3-e03e3da6257e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857441 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a706b455-b1aa-4b2d-9ee3-714cb8801089-ovn-rundir\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.857476 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a706b455-b1aa-4b2d-9ee3-714cb8801089-config\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.885633 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a706b455-b1aa-4b2d-9ee3-714cb8801089-combined-ca-bundle\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.907334 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.907515 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a706b455-b1aa-4b2d-9ee3-714cb8801089-ovs-rundir\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.907575 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a706b455-b1aa-4b2d-9ee3-714cb8801089-ovn-rundir\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.932745 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a706b455-b1aa-4b2d-9ee3-714cb8801089-config\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.936052 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a706b455-b1aa-4b2d-9ee3-714cb8801089-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.936858 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kg2g\" (UniqueName: \"kubernetes.io/projected/a706b455-b1aa-4b2d-9ee3-714cb8801089-kube-api-access-9kg2g\") pod \"ovn-controller-metrics-ntdrw\" (UID: \"a706b455-b1aa-4b2d-9ee3-714cb8801089\") " pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.954280 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ntdrw" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.965352 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a62375-a3a2-44ec-b5e3-e03e3da6257e-config\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.965581 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.965735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25a62375-a3a2-44ec-b5e3-e03e3da6257e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.965867 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25a62375-a3a2-44ec-b5e3-e03e3da6257e-scripts\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.966643 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.989427 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.989482 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbl9l\" (UniqueName: \"kubernetes.io/projected/25a62375-a3a2-44ec-b5e3-e03e3da6257e-kube-api-access-wbl9l\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.989348 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a62375-a3a2-44ec-b5e3-e03e3da6257e-config\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.969258 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/25a62375-a3a2-44ec-b5e3-e03e3da6257e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.976657 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.976805 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.976852 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 15 17:20:44 crc kubenswrapper[4585]: I0215 17:20:44.990822 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/25a62375-a3a2-44ec-b5e3-e03e3da6257e-scripts\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.087431 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.118322 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.123227 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-ds72g"] Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.124661 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ds72g"] Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.124735 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.125206 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/25a62375-a3a2-44ec-b5e3-e03e3da6257e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.130235 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbl9l\" (UniqueName: \"kubernetes.io/projected/25a62375-a3a2-44ec-b5e3-e03e3da6257e-kube-api-access-wbl9l\") pod \"ovn-northd-0\" (UID: \"25a62375-a3a2-44ec-b5e3-e03e3da6257e\") " pod="openstack/ovn-northd-0" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.147317 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.236972 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" event={"ID":"e74925c2-940d-48c3-853b-a0664af9b31d","Type":"ContainerStarted","Data":"cdbb66cb091455ddefed2ff5b0babd8a86555764353f4ff37e786b94efb3778e"} Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.237125 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" containerName="dnsmasq-dns" containerID="cri-o://cdbb66cb091455ddefed2ff5b0babd8a86555764353f4ff37e786b94efb3778e" gracePeriod=10 Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.237365 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.264050 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"9de65e3c-3874-4fc0-9566-84138bb228b7","Type":"ContainerStarted","Data":"2dc9046518fb8df2747e6c9324c0f5f19fc85c8b9ac0bc03342da2ba06ce681d"} Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.295707 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" podStartSLOduration=4.13337341 podStartE2EDuration="42.295688301s" podCreationTimestamp="2026-02-15 17:20:03 +0000 UTC" firstStartedPulling="2026-02-15 17:20:04.57882426 +0000 UTC m=+860.522232392" lastFinishedPulling="2026-02-15 17:20:42.741139151 +0000 UTC m=+898.684547283" observedRunningTime="2026-02-15 17:20:45.279209751 +0000 UTC m=+901.222617883" watchObservedRunningTime="2026-02-15 17:20:45.295688301 +0000 UTC m=+901.239096433" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.310611 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" podUID="22360463-bc5c-4e70-abbe-710820b7344b" containerName="dnsmasq-dns" containerID="cri-o://5d1d4eaf27b9116256603757467a8d1aeb3c681f4359fbc74a89aaa8def03244" gracePeriod=10 Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.310856 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" event={"ID":"22360463-bc5c-4e70-abbe-710820b7344b","Type":"ContainerStarted","Data":"5d1d4eaf27b9116256603757467a8d1aeb3c681f4359fbc74a89aaa8def03244"} Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.311289 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.323649 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " 
pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.323739 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68fh\" (UniqueName: \"kubernetes.io/projected/2b2bae3d-b010-43cc-ad44-5573802b4517-kube-api-access-z68fh\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.323774 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.323795 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-config\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.323823 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-dns-svc\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.325636 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lqzkx" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.333552 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.389753 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" podStartSLOduration=-9223371994.465042 podStartE2EDuration="42.389734325s" podCreationTimestamp="2026-02-15 17:20:03 +0000 UTC" firstStartedPulling="2026-02-15 17:20:04.913397442 +0000 UTC m=+860.856805574" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:45.382459126 +0000 UTC m=+901.325867258" watchObservedRunningTime="2026-02-15 17:20:45.389734325 +0000 UTC m=+901.333142457" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.425556 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.425987 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z68fh\" (UniqueName: \"kubernetes.io/projected/2b2bae3d-b010-43cc-ad44-5573802b4517-kube-api-access-z68fh\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.426058 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.426090 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-config\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.426138 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-dns-svc\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.426723 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.427141 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hkzs9"] Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.427277 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.427737 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-dns-svc\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.429194 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-config\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.451338 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68fh\" (UniqueName: \"kubernetes.io/projected/2b2bae3d-b010-43cc-ad44-5573802b4517-kube-api-access-z68fh\") pod \"dnsmasq-dns-8554648995-ds72g\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.462195 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:45 crc kubenswrapper[4585]: I0215 17:20:45.751909 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ntdrw"] Feb 15 17:20:45 crc kubenswrapper[4585]: W0215 17:20:45.994911 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62bf19dd_0faa_4e78_b944_ef44f34d3a66.slice/crio-1458203a2044069639071acbcfce48d9234ed63d9a25278c76cda8d2e93d747c WatchSource:0}: Error finding container 1458203a2044069639071acbcfce48d9234ed63d9a25278c76cda8d2e93d747c: Status 404 returned error can't find the container with id 1458203a2044069639071acbcfce48d9234ed63d9a25278c76cda8d2e93d747c Feb 15 17:20:46 crc kubenswrapper[4585]: W0215 17:20:46.014764 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda706b455_b1aa_4b2d_9ee3_714cb8801089.slice/crio-99f5fe7592bb6328a9cf48ba4f03ea2845bf45943bc097a07b967d36317eb3e3 WatchSource:0}: Error finding container 99f5fe7592bb6328a9cf48ba4f03ea2845bf45943bc097a07b967d36317eb3e3: Status 404 returned error 
can't find the container with id 99f5fe7592bb6328a9cf48ba4f03ea2845bf45943bc097a07b967d36317eb3e3 Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.122771 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.123052 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.242352 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.340583 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ntdrw" event={"ID":"a706b455-b1aa-4b2d-9ee3-714cb8801089","Type":"ContainerStarted","Data":"99f5fe7592bb6328a9cf48ba4f03ea2845bf45943bc097a07b967d36317eb3e3"} Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.350207 4585 generic.go:334] "Generic (PLEG): container finished" podID="22360463-bc5c-4e70-abbe-710820b7344b" containerID="5d1d4eaf27b9116256603757467a8d1aeb3c681f4359fbc74a89aaa8def03244" exitCode=0 Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.350294 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" event={"ID":"22360463-bc5c-4e70-abbe-710820b7344b","Type":"ContainerDied","Data":"5d1d4eaf27b9116256603757467a8d1aeb3c681f4359fbc74a89aaa8def03244"} Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.360526 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" event={"ID":"62bf19dd-0faa-4e78-b944-ef44f34d3a66","Type":"ContainerStarted","Data":"1458203a2044069639071acbcfce48d9234ed63d9a25278c76cda8d2e93d747c"} Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.364880 4585 generic.go:334] "Generic (PLEG): container finished" podID="e74925c2-940d-48c3-853b-a0664af9b31d" 
containerID="cdbb66cb091455ddefed2ff5b0babd8a86555764353f4ff37e786b94efb3778e" exitCode=0 Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.365793 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" event={"ID":"e74925c2-940d-48c3-853b-a0664af9b31d","Type":"ContainerDied","Data":"cdbb66cb091455ddefed2ff5b0babd8a86555764353f4ff37e786b94efb3778e"} Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.500907 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.714074 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 15 17:20:46 crc kubenswrapper[4585]: W0215 17:20:46.718365 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25a62375_a3a2_44ec_b5e3_e03e3da6257e.slice/crio-bf20c75d9c7f19ed57a13a723ea7af1c686a2d314de1faaa6797ca0277bb7463 WatchSource:0}: Error finding container bf20c75d9c7f19ed57a13a723ea7af1c686a2d314de1faaa6797ca0277bb7463: Status 404 returned error can't find the container with id bf20c75d9c7f19ed57a13a723ea7af1c686a2d314de1faaa6797ca0277bb7463 Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.773294 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.836188 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.836453 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ds72g"] Feb 15 17:20:46 crc kubenswrapper[4585]: W0215 17:20:46.860284 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b2bae3d_b010_43cc_ad44_5573802b4517.slice/crio-a764b55e48a5aa48e7eb67f339a9df13e872c390762e6f4551b20dc2dfe5fb00 WatchSource:0}: Error finding container a764b55e48a5aa48e7eb67f339a9df13e872c390762e6f4551b20dc2dfe5fb00: Status 404 returned error can't find the container with id a764b55e48a5aa48e7eb67f339a9df13e872c390762e6f4551b20dc2dfe5fb00 Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.972690 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw7nn\" (UniqueName: \"kubernetes.io/projected/e74925c2-940d-48c3-853b-a0664af9b31d-kube-api-access-dw7nn\") pod \"e74925c2-940d-48c3-853b-a0664af9b31d\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.972760 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-dns-svc\") pod \"22360463-bc5c-4e70-abbe-710820b7344b\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.972785 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-dns-svc\") pod \"e74925c2-940d-48c3-853b-a0664af9b31d\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.976630 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkx5c\" 
(UniqueName: \"kubernetes.io/projected/22360463-bc5c-4e70-abbe-710820b7344b-kube-api-access-tkx5c\") pod \"22360463-bc5c-4e70-abbe-710820b7344b\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.977026 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-config\") pod \"e74925c2-940d-48c3-853b-a0664af9b31d\" (UID: \"e74925c2-940d-48c3-853b-a0664af9b31d\") " Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.977104 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-config\") pod \"22360463-bc5c-4e70-abbe-710820b7344b\" (UID: \"22360463-bc5c-4e70-abbe-710820b7344b\") " Feb 15 17:20:46 crc kubenswrapper[4585]: I0215 17:20:46.985828 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74925c2-940d-48c3-853b-a0664af9b31d-kube-api-access-dw7nn" (OuterVolumeSpecName: "kube-api-access-dw7nn") pod "e74925c2-940d-48c3-853b-a0664af9b31d" (UID: "e74925c2-940d-48c3-853b-a0664af9b31d"). InnerVolumeSpecName "kube-api-access-dw7nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.001088 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22360463-bc5c-4e70-abbe-710820b7344b-kube-api-access-tkx5c" (OuterVolumeSpecName: "kube-api-access-tkx5c") pod "22360463-bc5c-4e70-abbe-710820b7344b" (UID: "22360463-bc5c-4e70-abbe-710820b7344b"). InnerVolumeSpecName "kube-api-access-tkx5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.013783 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.013825 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.043290 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e74925c2-940d-48c3-853b-a0664af9b31d" (UID: "e74925c2-940d-48c3-853b-a0664af9b31d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.046628 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-config" (OuterVolumeSpecName: "config") pod "22360463-bc5c-4e70-abbe-710820b7344b" (UID: "22360463-bc5c-4e70-abbe-710820b7344b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.062841 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-config" (OuterVolumeSpecName: "config") pod "e74925c2-940d-48c3-853b-a0664af9b31d" (UID: "e74925c2-940d-48c3-853b-a0664af9b31d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.084252 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkx5c\" (UniqueName: \"kubernetes.io/projected/22360463-bc5c-4e70-abbe-710820b7344b-kube-api-access-tkx5c\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.084292 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.084307 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.084320 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e74925c2-940d-48c3-853b-a0664af9b31d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.084331 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw7nn\" (UniqueName: \"kubernetes.io/projected/e74925c2-940d-48c3-853b-a0664af9b31d-kube-api-access-dw7nn\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.090164 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22360463-bc5c-4e70-abbe-710820b7344b" (UID: "22360463-bc5c-4e70-abbe-710820b7344b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.185982 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22360463-bc5c-4e70-abbe-710820b7344b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.376713 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25a62375-a3a2-44ec-b5e3-e03e3da6257e","Type":"ContainerStarted","Data":"bf20c75d9c7f19ed57a13a723ea7af1c686a2d314de1faaa6797ca0277bb7463"} Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.380574 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" event={"ID":"22360463-bc5c-4e70-abbe-710820b7344b","Type":"ContainerDied","Data":"845e0eccadb98a41ae7c7c61cc2f946ce6e014dfe3646021f983665fe97c6d18"} Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.380638 4585 scope.go:117] "RemoveContainer" containerID="5d1d4eaf27b9116256603757467a8d1aeb3c681f4359fbc74a89aaa8def03244" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.380608 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4bfs4" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.397727 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" event={"ID":"e74925c2-940d-48c3-853b-a0664af9b31d","Type":"ContainerDied","Data":"c6c32d025092318060cb7a5d000a10a7454b6fae75911672be7661a93373ce9b"} Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.397846 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbbjr" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.413135 4585 generic.go:334] "Generic (PLEG): container finished" podID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerID="336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135" exitCode=0 Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.413220 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" event={"ID":"62bf19dd-0faa-4e78-b944-ef44f34d3a66","Type":"ContainerDied","Data":"336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135"} Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.425816 4585 generic.go:334] "Generic (PLEG): container finished" podID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerID="f634dd15c25ca67e29a95602961918227f95f77fde20cd8620915696830b8ee5" exitCode=0 Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.425886 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ds72g" event={"ID":"2b2bae3d-b010-43cc-ad44-5573802b4517","Type":"ContainerDied","Data":"f634dd15c25ca67e29a95602961918227f95f77fde20cd8620915696830b8ee5"} Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.425913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ds72g" event={"ID":"2b2bae3d-b010-43cc-ad44-5573802b4517","Type":"ContainerStarted","Data":"a764b55e48a5aa48e7eb67f339a9df13e872c390762e6f4551b20dc2dfe5fb00"} Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.427215 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4bfs4"] Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.430609 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ntdrw" event={"ID":"a706b455-b1aa-4b2d-9ee3-714cb8801089","Type":"ContainerStarted","Data":"bb14f9c4f9b2be60eff40f5c8e09602599c23199ab59ecfb6f38d781991f6fcf"} 
Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.433935 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4bfs4"] Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.462532 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbbjr"] Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.478630 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbbjr"] Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.482899 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.482930 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.491830 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ntdrw" podStartSLOduration=3.491810679 podStartE2EDuration="3.491810679s" podCreationTimestamp="2026-02-15 17:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:47.471885626 +0000 UTC m=+903.415293758" watchObservedRunningTime="2026-02-15 17:20:47.491810679 +0000 UTC m=+903.435218811" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.599245 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.899343 4585 scope.go:117] "RemoveContainer" containerID="7c11368d3313342f16cf24ea380336caa3bdae61bcf493cd11cda9b43283083f" Feb 15 17:20:47 crc kubenswrapper[4585]: I0215 17:20:47.918867 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.058786 
4585 scope.go:117] "RemoveContainer" containerID="cdbb66cb091455ddefed2ff5b0babd8a86555764353f4ff37e786b94efb3778e" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.120736 4585 scope.go:117] "RemoveContainer" containerID="4e508be139df2d023ec26d45045abda460004ae1a271515346ce58b63b55e00f" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.370210 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b1da-account-create-update-nqw28"] Feb 15 17:20:48 crc kubenswrapper[4585]: E0215 17:20:48.371098 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22360463-bc5c-4e70-abbe-710820b7344b" containerName="dnsmasq-dns" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.371169 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="22360463-bc5c-4e70-abbe-710820b7344b" containerName="dnsmasq-dns" Feb 15 17:20:48 crc kubenswrapper[4585]: E0215 17:20:48.371234 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" containerName="dnsmasq-dns" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.371284 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" containerName="dnsmasq-dns" Feb 15 17:20:48 crc kubenswrapper[4585]: E0215 17:20:48.371337 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22360463-bc5c-4e70-abbe-710820b7344b" containerName="init" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.371382 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="22360463-bc5c-4e70-abbe-710820b7344b" containerName="init" Feb 15 17:20:48 crc kubenswrapper[4585]: E0215 17:20:48.371443 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" containerName="init" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.371504 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" containerName="init" Feb 
15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.371798 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="22360463-bc5c-4e70-abbe-710820b7344b" containerName="dnsmasq-dns" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.371882 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" containerName="dnsmasq-dns" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.372665 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.377728 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.387986 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b1da-account-create-update-nqw28"] Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.438041 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-w957z"] Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.439555 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.458891 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-w957z"] Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.459902 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" event={"ID":"62bf19dd-0faa-4e78-b944-ef44f34d3a66","Type":"ContainerStarted","Data":"010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d"} Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.460134 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.472043 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ds72g" event={"ID":"2b2bae3d-b010-43cc-ad44-5573802b4517","Type":"ContainerStarted","Data":"8398705ed3c434e0f19b2d22e75c60b15b7c5aeef5327e7e52380ac7257c20bc"} Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.477728 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.477747 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25a62375-a3a2-44ec-b5e3-e03e3da6257e","Type":"ContainerStarted","Data":"675b7e31bf030169b257e1f0ca46e87926c437be1241c7ef9c56cf5d1c58a4d7"} Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.508719 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" podStartSLOduration=4.508701971 podStartE2EDuration="4.508701971s" podCreationTimestamp="2026-02-15 17:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:48.487368829 +0000 UTC m=+904.430776961" 
watchObservedRunningTime="2026-02-15 17:20:48.508701971 +0000 UTC m=+904.452110103" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.517644 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-ds72g" podStartSLOduration=4.517622054 podStartE2EDuration="4.517622054s" podCreationTimestamp="2026-02-15 17:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:48.50427619 +0000 UTC m=+904.447684342" watchObservedRunningTime="2026-02-15 17:20:48.517622054 +0000 UTC m=+904.461030186" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.517987 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x4pf\" (UniqueName: \"kubernetes.io/projected/7ab2a2c2-523c-43c3-9776-902776a34ad4-kube-api-access-8x4pf\") pod \"glance-db-create-w957z\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.518120 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ab2a2c2-523c-43c3-9776-902776a34ad4-operator-scripts\") pod \"glance-db-create-w957z\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.518403 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224730b2-4e83-42a8-b057-11c1ae0fd14f-operator-scripts\") pod \"glance-b1da-account-create-update-nqw28\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.518525 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wbjz\" (UniqueName: \"kubernetes.io/projected/224730b2-4e83-42a8-b057-11c1ae0fd14f-kube-api-access-7wbjz\") pod \"glance-b1da-account-create-update-nqw28\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.598444 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.621144 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x4pf\" (UniqueName: \"kubernetes.io/projected/7ab2a2c2-523c-43c3-9776-902776a34ad4-kube-api-access-8x4pf\") pod \"glance-db-create-w957z\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.621227 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ab2a2c2-523c-43c3-9776-902776a34ad4-operator-scripts\") pod \"glance-db-create-w957z\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.621276 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224730b2-4e83-42a8-b057-11c1ae0fd14f-operator-scripts\") pod \"glance-b1da-account-create-update-nqw28\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.621322 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wbjz\" (UniqueName: \"kubernetes.io/projected/224730b2-4e83-42a8-b057-11c1ae0fd14f-kube-api-access-7wbjz\") pod 
\"glance-b1da-account-create-update-nqw28\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.622531 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ab2a2c2-523c-43c3-9776-902776a34ad4-operator-scripts\") pod \"glance-db-create-w957z\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.623022 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224730b2-4e83-42a8-b057-11c1ae0fd14f-operator-scripts\") pod \"glance-b1da-account-create-update-nqw28\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.645669 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x4pf\" (UniqueName: \"kubernetes.io/projected/7ab2a2c2-523c-43c3-9776-902776a34ad4-kube-api-access-8x4pf\") pod \"glance-db-create-w957z\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.649148 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wbjz\" (UniqueName: \"kubernetes.io/projected/224730b2-4e83-42a8-b057-11c1ae0fd14f-kube-api-access-7wbjz\") pod \"glance-b1da-account-create-update-nqw28\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.688014 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.772137 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-w957z" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.853281 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22360463-bc5c-4e70-abbe-710820b7344b" path="/var/lib/kubelet/pods/22360463-bc5c-4e70-abbe-710820b7344b/volumes" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.853931 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e74925c2-940d-48c3-853b-a0664af9b31d" path="/var/lib/kubelet/pods/e74925c2-940d-48c3-853b-a0664af9b31d/volumes" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.865756 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rtgm2"] Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.875177 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.904824 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rtgm2"] Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.931522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2dbc\" (UniqueName: \"kubernetes.io/projected/ece8b5dd-b54b-400c-a62a-4995d15e8763-kube-api-access-j2dbc\") pod \"keystone-db-create-rtgm2\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.931591 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece8b5dd-b54b-400c-a62a-4995d15e8763-operator-scripts\") pod \"keystone-db-create-rtgm2\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.980158 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fbb4-account-create-update-pc58j"] Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.981647 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:48 crc kubenswrapper[4585]: I0215 17:20:48.983799 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.009739 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fbb4-account-create-update-pc58j"] Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.033318 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2dbc\" (UniqueName: \"kubernetes.io/projected/ece8b5dd-b54b-400c-a62a-4995d15e8763-kube-api-access-j2dbc\") pod \"keystone-db-create-rtgm2\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.033373 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece8b5dd-b54b-400c-a62a-4995d15e8763-operator-scripts\") pod \"keystone-db-create-rtgm2\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.033414 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bs6\" (UniqueName: \"kubernetes.io/projected/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-kube-api-access-f2bs6\") pod \"keystone-fbb4-account-create-update-pc58j\" (UID: \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.033443 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-operator-scripts\") pod \"keystone-fbb4-account-create-update-pc58j\" (UID: 
\"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.034794 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece8b5dd-b54b-400c-a62a-4995d15e8763-operator-scripts\") pod \"keystone-db-create-rtgm2\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.052580 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2dbc\" (UniqueName: \"kubernetes.io/projected/ece8b5dd-b54b-400c-a62a-4995d15e8763-kube-api-access-j2dbc\") pod \"keystone-db-create-rtgm2\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.067840 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2vh2n"] Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.069883 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.090171 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2vh2n"] Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.139645 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bs6\" (UniqueName: \"kubernetes.io/projected/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-kube-api-access-f2bs6\") pod \"keystone-fbb4-account-create-update-pc58j\" (UID: \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.139704 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8916579d-e13d-4087-8c29-41a8ec210f9a-operator-scripts\") pod \"placement-db-create-2vh2n\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.139726 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-operator-scripts\") pod \"keystone-fbb4-account-create-update-pc58j\" (UID: \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.139780 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjgd\" (UniqueName: \"kubernetes.io/projected/8916579d-e13d-4087-8c29-41a8ec210f9a-kube-api-access-fcjgd\") pod \"placement-db-create-2vh2n\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.140427 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-operator-scripts\") pod \"keystone-fbb4-account-create-update-pc58j\" (UID: \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.186354 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bs6\" (UniqueName: \"kubernetes.io/projected/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-kube-api-access-f2bs6\") pod \"keystone-fbb4-account-create-update-pc58j\" (UID: \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.194853 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1491-account-create-update-f9fqz"] Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.196194 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.198822 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.207336 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.210497 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1491-account-create-update-f9fqz"] Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.243735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8916579d-e13d-4087-8c29-41a8ec210f9a-operator-scripts\") pod \"placement-db-create-2vh2n\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.243802 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjgd\" (UniqueName: \"kubernetes.io/projected/8916579d-e13d-4087-8c29-41a8ec210f9a-kube-api-access-fcjgd\") pod \"placement-db-create-2vh2n\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.244514 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8916579d-e13d-4087-8c29-41a8ec210f9a-operator-scripts\") pod \"placement-db-create-2vh2n\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.263181 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjgd\" (UniqueName: \"kubernetes.io/projected/8916579d-e13d-4087-8c29-41a8ec210f9a-kube-api-access-fcjgd\") pod \"placement-db-create-2vh2n\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.281230 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-w957z"] Feb 15 17:20:49 crc kubenswrapper[4585]: 
I0215 17:20:49.314262 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.345802 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jsr\" (UniqueName: \"kubernetes.io/projected/c94c30da-2a32-4256-a0a5-13c6c7a54725-kube-api-access-x9jsr\") pod \"placement-1491-account-create-update-f9fqz\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.345863 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94c30da-2a32-4256-a0a5-13c6c7a54725-operator-scripts\") pod \"placement-1491-account-create-update-f9fqz\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.412169 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.445082 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b1da-account-create-update-nqw28"] Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.452268 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jsr\" (UniqueName: \"kubernetes.io/projected/c94c30da-2a32-4256-a0a5-13c6c7a54725-kube-api-access-x9jsr\") pod \"placement-1491-account-create-update-f9fqz\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.457372 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94c30da-2a32-4256-a0a5-13c6c7a54725-operator-scripts\") pod \"placement-1491-account-create-update-f9fqz\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.458168 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94c30da-2a32-4256-a0a5-13c6c7a54725-operator-scripts\") pod \"placement-1491-account-create-update-f9fqz\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.489232 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jsr\" (UniqueName: \"kubernetes.io/projected/c94c30da-2a32-4256-a0a5-13c6c7a54725-kube-api-access-x9jsr\") pod \"placement-1491-account-create-update-f9fqz\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: W0215 17:20:49.500983 4585 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod224730b2_4e83_42a8_b057_11c1ae0fd14f.slice/crio-638abc2f46adce1b6b614c4dc0caf97a280b2fbc889944a135c9bcea68568f9a WatchSource:0}: Error finding container 638abc2f46adce1b6b614c4dc0caf97a280b2fbc889944a135c9bcea68568f9a: Status 404 returned error can't find the container with id 638abc2f46adce1b6b614c4dc0caf97a280b2fbc889944a135c9bcea68568f9a Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.513064 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w957z" event={"ID":"7ab2a2c2-523c-43c3-9776-902776a34ad4","Type":"ContainerStarted","Data":"a0f568d33ec8de950aa8c07025e086849cf605ca1c9d6597dae46c0234c8229f"} Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.517642 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"25a62375-a3a2-44ec-b5e3-e03e3da6257e","Type":"ContainerStarted","Data":"7e9d28665d0396dfd2bad90f4a8b953ebff85118ece9ef4bd2f9b1f453a83398"} Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.517676 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.529649 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-w957z" podStartSLOduration=1.529627852 podStartE2EDuration="1.529627852s" podCreationTimestamp="2026-02-15 17:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:49.525581702 +0000 UTC m=+905.468989844" watchObservedRunningTime="2026-02-15 17:20:49.529627852 +0000 UTC m=+905.473035984" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.534820 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.552010 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.200391416 podStartE2EDuration="5.551995821s" podCreationTimestamp="2026-02-15 17:20:44 +0000 UTC" firstStartedPulling="2026-02-15 17:20:46.721309615 +0000 UTC m=+902.664717747" lastFinishedPulling="2026-02-15 17:20:48.07291402 +0000 UTC m=+904.016322152" observedRunningTime="2026-02-15 17:20:49.546511732 +0000 UTC m=+905.489919864" watchObservedRunningTime="2026-02-15 17:20:49.551995821 +0000 UTC m=+905.495403953" Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.730219 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rtgm2"] Feb 15 17:20:49 crc kubenswrapper[4585]: I0215 17:20:49.971657 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fbb4-account-create-update-pc58j"] Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.100482 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hkzs9"] Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.140464 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.157657 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2vh2n"] Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.171949 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8qg92"] Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.173396 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.200539 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8qg92"] Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.262261 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1491-account-create-update-f9fqz"] Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.299540 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-config\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.299796 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.300032 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.300172 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.300268 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5kw\" (UniqueName: \"kubernetes.io/projected/86b120f0-0567-42fa-b617-69cf1d4854c7-kube-api-access-vh5kw\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.403756 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.403819 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.403861 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.403880 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5kw\" (UniqueName: \"kubernetes.io/projected/86b120f0-0567-42fa-b617-69cf1d4854c7-kube-api-access-vh5kw\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.403906 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-config\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.405499 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.407113 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-config\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.407137 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.408510 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.432931 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vh5kw\" (UniqueName: \"kubernetes.io/projected/86b120f0-0567-42fa-b617-69cf1d4854c7-kube-api-access-vh5kw\") pod \"dnsmasq-dns-b8fbc5445-8qg92\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") " pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.552744 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b1da-account-create-update-nqw28" event={"ID":"224730b2-4e83-42a8-b057-11c1ae0fd14f","Type":"ContainerStarted","Data":"6ed929907234f3bf0f8d200fc28e94a9714b1d0fd2a54847a6fb170ec14504b7"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.552790 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b1da-account-create-update-nqw28" event={"ID":"224730b2-4e83-42a8-b057-11c1ae0fd14f","Type":"ContainerStarted","Data":"638abc2f46adce1b6b614c4dc0caf97a280b2fbc889944a135c9bcea68568f9a"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.559686 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.560079 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2vh2n" event={"ID":"8916579d-e13d-4087-8c29-41a8ec210f9a","Type":"ContainerStarted","Data":"f7f9e5bde28b148ed927a82975bc2fdbcec0fa0bb33d0b2bdbe702d0898023e3"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.576458 4585 generic.go:334] "Generic (PLEG): container finished" podID="7ab2a2c2-523c-43c3-9776-902776a34ad4" containerID="1133775a544ecd10893348bd244e644d535aafa7205b58a646fb768055cab0c3" exitCode=0 Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.576533 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w957z" event={"ID":"7ab2a2c2-523c-43c3-9776-902776a34ad4","Type":"ContainerDied","Data":"1133775a544ecd10893348bd244e644d535aafa7205b58a646fb768055cab0c3"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.577887 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b1da-account-create-update-nqw28" podStartSLOduration=2.577869828 podStartE2EDuration="2.577869828s" podCreationTimestamp="2026-02-15 17:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:50.566854008 +0000 UTC m=+906.510262140" watchObservedRunningTime="2026-02-15 17:20:50.577869828 +0000 UTC m=+906.521277960" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.587836 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fbb4-account-create-update-pc58j" event={"ID":"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2","Type":"ContainerStarted","Data":"65a2dee02e0546d98b0533f6c018b1cbeb0b18c5ac900c86936080810d10cf84"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.628637 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-rtgm2" event={"ID":"ece8b5dd-b54b-400c-a62a-4995d15e8763","Type":"ContainerStarted","Data":"3c88581b6d975449e8fd5787046dbf15c976a17261ef585a85565ea999732c8a"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.628679 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rtgm2" event={"ID":"ece8b5dd-b54b-400c-a62a-4995d15e8763","Type":"ContainerStarted","Data":"6a6636726e535b693680532c3c192e6a679cb1536abbf74f99048e274a9bd24d"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.631254 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1491-account-create-update-f9fqz" event={"ID":"c94c30da-2a32-4256-a0a5-13c6c7a54725","Type":"ContainerStarted","Data":"b8e179cd1cd338035d91fbf1ba5ca019fcd63e90d2b12c12c9930c98ab4d97fe"} Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.631614 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" podUID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerName="dnsmasq-dns" containerID="cri-o://010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d" gracePeriod=10 Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.667063 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-rtgm2" podStartSLOduration=2.666975157 podStartE2EDuration="2.666975157s" podCreationTimestamp="2026-02-15 17:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:50.654043105 +0000 UTC m=+906.597451237" watchObservedRunningTime="2026-02-15 17:20:50.666975157 +0000 UTC m=+906.610383289" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.681388 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-1491-account-create-update-f9fqz" podStartSLOduration=1.68137207 
podStartE2EDuration="1.68137207s" podCreationTimestamp="2026-02-15 17:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:50.677893995 +0000 UTC m=+906.621302127" watchObservedRunningTime="2026-02-15 17:20:50.68137207 +0000 UTC m=+906.624780202" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.800175 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:50 crc kubenswrapper[4585]: I0215 17:20:50.800208 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:20:51 crc kubenswrapper[4585]: E0215 17:20:51.063355 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62bf19dd_0faa_4e78_b944_ef44f34d3a66.slice/crio-010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d.scope\": RecentStats: unable to find data in memory cache]" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.316820 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.332147 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.336237 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-d6kgx" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.336445 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.336551 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.337762 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.341847 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.359706 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.429803 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c8f627-225d-40ac-827b-d2f3476c1768-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.430065 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp982\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-kube-api-access-fp982\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.430185 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.430337 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/92c8f627-225d-40ac-827b-d2f3476c1768-lock\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.430685 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/92c8f627-225d-40ac-827b-d2f3476c1768-cache\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.430826 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.448653 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.513819 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.531912 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc\") pod \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\" (UID: 
\"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.531967 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tszc8\" (UniqueName: \"kubernetes.io/projected/62bf19dd-0faa-4e78-b944-ef44f34d3a66-kube-api-access-tszc8\") pod \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532022 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb\") pod \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532047 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-config\") pod \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532274 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/92c8f627-225d-40ac-827b-d2f3476c1768-cache\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532320 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532346 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp982\" (UniqueName: 
\"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-kube-api-access-fp982\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532362 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c8f627-225d-40ac-827b-d2f3476c1768-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532405 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.532449 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/92c8f627-225d-40ac-827b-d2f3476c1768-lock\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.537719 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/92c8f627-225d-40ac-827b-d2f3476c1768-lock\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.553400 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/92c8f627-225d-40ac-827b-d2f3476c1768-cache\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: E0215 17:20:51.555331 4585 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 15 17:20:51 crc kubenswrapper[4585]: E0215 17:20:51.555382 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 15 17:20:51 crc kubenswrapper[4585]: E0215 17:20:51.555457 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift podName:92c8f627-225d-40ac-827b-d2f3476c1768 nodeName:}" failed. No retries permitted until 2026-02-15 17:20:52.055412117 +0000 UTC m=+907.998820249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift") pod "swift-storage-0" (UID: "92c8f627-225d-40ac-827b-d2f3476c1768") : configmap "swift-ring-files" not found Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.556587 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.587448 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8qg92"] Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.596952 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp982\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-kube-api-access-fp982\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.603069 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/62bf19dd-0faa-4e78-b944-ef44f34d3a66-kube-api-access-tszc8" (OuterVolumeSpecName: "kube-api-access-tszc8") pod "62bf19dd-0faa-4e78-b944-ef44f34d3a66" (UID: "62bf19dd-0faa-4e78-b944-ef44f34d3a66"). InnerVolumeSpecName "kube-api-access-tszc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.610211 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c8f627-225d-40ac-827b-d2f3476c1768-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.624076 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.632909 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62bf19dd-0faa-4e78-b944-ef44f34d3a66" (UID: "62bf19dd-0faa-4e78-b944-ef44f34d3a66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.633309 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62bf19dd-0faa-4e78-b944-ef44f34d3a66" (UID: "62bf19dd-0faa-4e78-b944-ef44f34d3a66"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.634029 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc\") pod \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.634144 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb\") pod \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\" (UID: \"62bf19dd-0faa-4e78-b944-ef44f34d3a66\") " Feb 15 17:20:51 crc kubenswrapper[4585]: W0215 17:20:51.636040 4585 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/62bf19dd-0faa-4e78-b944-ef44f34d3a66/volumes/kubernetes.io~configmap/dns-svc Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.636115 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62bf19dd-0faa-4e78-b944-ef44f34d3a66" (UID: "62bf19dd-0faa-4e78-b944-ef44f34d3a66"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:51 crc kubenswrapper[4585]: W0215 17:20:51.637856 4585 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/62bf19dd-0faa-4e78-b944-ef44f34d3a66/volumes/kubernetes.io~configmap/ovsdbserver-sb Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.637930 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62bf19dd-0faa-4e78-b944-ef44f34d3a66" (UID: "62bf19dd-0faa-4e78-b944-ef44f34d3a66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.638344 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.638386 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.638396 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tszc8\" (UniqueName: \"kubernetes.io/projected/62bf19dd-0faa-4e78-b944-ef44f34d3a66-kube-api-access-tszc8\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.652649 4585 generic.go:334] "Generic (PLEG): container finished" podID="224730b2-4e83-42a8-b057-11c1ae0fd14f" containerID="6ed929907234f3bf0f8d200fc28e94a9714b1d0fd2a54847a6fb170ec14504b7" exitCode=0 Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.652736 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b1da-account-create-update-nqw28" 
event={"ID":"224730b2-4e83-42a8-b057-11c1ae0fd14f","Type":"ContainerDied","Data":"6ed929907234f3bf0f8d200fc28e94a9714b1d0fd2a54847a6fb170ec14504b7"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.654270 4585 generic.go:334] "Generic (PLEG): container finished" podID="8916579d-e13d-4087-8c29-41a8ec210f9a" containerID="af9c794c4169f98d81ca0d6f6d4ff4e1c56217fa801b953e97b3e4902c982cef" exitCode=0 Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.654382 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2vh2n" event={"ID":"8916579d-e13d-4087-8c29-41a8ec210f9a","Type":"ContainerDied","Data":"af9c794c4169f98d81ca0d6f6d4ff4e1c56217fa801b953e97b3e4902c982cef"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.657174 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-config" (OuterVolumeSpecName: "config") pod "62bf19dd-0faa-4e78-b944-ef44f34d3a66" (UID: "62bf19dd-0faa-4e78-b944-ef44f34d3a66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.661638 4585 generic.go:334] "Generic (PLEG): container finished" podID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerID="010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d" exitCode=0 Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.661693 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" event={"ID":"62bf19dd-0faa-4e78-b944-ef44f34d3a66","Type":"ContainerDied","Data":"010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.661721 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" event={"ID":"62bf19dd-0faa-4e78-b944-ef44f34d3a66","Type":"ContainerDied","Data":"1458203a2044069639071acbcfce48d9234ed63d9a25278c76cda8d2e93d747c"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.661737 4585 scope.go:117] "RemoveContainer" containerID="010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.661851 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hkzs9" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.663374 4585 generic.go:334] "Generic (PLEG): container finished" podID="e26f1cb2-7405-4d6e-bbde-c172c4bc32b2" containerID="04d960b801b51e4c7dcf70018d28226345910bcd3c4ca3d02660bd3e81871f31" exitCode=0 Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.663525 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fbb4-account-create-update-pc58j" event={"ID":"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2","Type":"ContainerDied","Data":"04d960b801b51e4c7dcf70018d28226345910bcd3c4ca3d02660bd3e81871f31"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.679550 4585 generic.go:334] "Generic (PLEG): container finished" podID="ece8b5dd-b54b-400c-a62a-4995d15e8763" containerID="3c88581b6d975449e8fd5787046dbf15c976a17261ef585a85565ea999732c8a" exitCode=0 Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.679675 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rtgm2" event={"ID":"ece8b5dd-b54b-400c-a62a-4995d15e8763","Type":"ContainerDied","Data":"3c88581b6d975449e8fd5787046dbf15c976a17261ef585a85565ea999732c8a"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.691667 4585 generic.go:334] "Generic (PLEG): container finished" podID="c94c30da-2a32-4256-a0a5-13c6c7a54725" containerID="4750d1ecd741fbda12fb7fb1a39c13166055c77a7c78006eb78b482e6bd7d111" exitCode=0 Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.691725 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1491-account-create-update-f9fqz" event={"ID":"c94c30da-2a32-4256-a0a5-13c6c7a54725","Type":"ContainerDied","Data":"4750d1ecd741fbda12fb7fb1a39c13166055c77a7c78006eb78b482e6bd7d111"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.695410 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" 
event={"ID":"86b120f0-0567-42fa-b617-69cf1d4854c7","Type":"ContainerStarted","Data":"223babbfa2e6ad1335bc7da2d67e8034bd57318541cae878e3794930acc0f1ea"} Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.740255 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bf19dd-0faa-4e78-b944-ef44f34d3a66-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.764207 4585 scope.go:117] "RemoveContainer" containerID="336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.802019 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hkzs9"] Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.810279 4585 scope.go:117] "RemoveContainer" containerID="010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d" Feb 15 17:20:51 crc kubenswrapper[4585]: E0215 17:20:51.811152 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d\": container with ID starting with 010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d not found: ID does not exist" containerID="010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.811213 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d"} err="failed to get container status \"010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d\": rpc error: code = NotFound desc = could not find container \"010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d\": container with ID starting with 010b3b51bb751ef69064a92b2030abe98e2032771032e8944336c5010bec1c6d not found: ID does not exist" Feb 15 17:20:51 
crc kubenswrapper[4585]: I0215 17:20:51.811236 4585 scope.go:117] "RemoveContainer" containerID="336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135" Feb 15 17:20:51 crc kubenswrapper[4585]: E0215 17:20:51.811746 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135\": container with ID starting with 336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135 not found: ID does not exist" containerID="336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.811777 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135"} err="failed to get container status \"336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135\": rpc error: code = NotFound desc = could not find container \"336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135\": container with ID starting with 336d93fd50866b5210176a05e48d589c118034a4bb4dc0f19ae35a798026b135 not found: ID does not exist" Feb 15 17:20:51 crc kubenswrapper[4585]: I0215 17:20:51.813543 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hkzs9"] Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.009788 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8jr8n" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="registry-server" probeResult="failure" output=< Feb 15 17:20:52 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:20:52 crc kubenswrapper[4585]: > Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.067319 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:52 crc kubenswrapper[4585]: E0215 17:20:52.067713 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 15 17:20:52 crc kubenswrapper[4585]: E0215 17:20:52.067729 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 15 17:20:52 crc kubenswrapper[4585]: E0215 17:20:52.067789 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift podName:92c8f627-225d-40ac-827b-d2f3476c1768 nodeName:}" failed. No retries permitted until 2026-02-15 17:20:53.067775834 +0000 UTC m=+909.011183966 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift") pod "swift-storage-0" (UID: "92c8f627-225d-40ac-827b-d2f3476c1768") : configmap "swift-ring-files" not found Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.104887 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hklp2"] Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.107966 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-w957z" Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.278005 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x4pf\" (UniqueName: \"kubernetes.io/projected/7ab2a2c2-523c-43c3-9776-902776a34ad4-kube-api-access-8x4pf\") pod \"7ab2a2c2-523c-43c3-9776-902776a34ad4\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.278365 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ab2a2c2-523c-43c3-9776-902776a34ad4-operator-scripts\") pod \"7ab2a2c2-523c-43c3-9776-902776a34ad4\" (UID: \"7ab2a2c2-523c-43c3-9776-902776a34ad4\") " Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.278827 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ab2a2c2-523c-43c3-9776-902776a34ad4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ab2a2c2-523c-43c3-9776-902776a34ad4" (UID: "7ab2a2c2-523c-43c3-9776-902776a34ad4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.282848 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab2a2c2-523c-43c3-9776-902776a34ad4-kube-api-access-8x4pf" (OuterVolumeSpecName: "kube-api-access-8x4pf") pod "7ab2a2c2-523c-43c3-9776-902776a34ad4" (UID: "7ab2a2c2-523c-43c3-9776-902776a34ad4"). InnerVolumeSpecName "kube-api-access-8x4pf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.380628 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x4pf\" (UniqueName: \"kubernetes.io/projected/7ab2a2c2-523c-43c3-9776-902776a34ad4-kube-api-access-8x4pf\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.380660 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ab2a2c2-523c-43c3-9776-902776a34ad4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.707498 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-w957z" event={"ID":"7ab2a2c2-523c-43c3-9776-902776a34ad4","Type":"ContainerDied","Data":"a0f568d33ec8de950aa8c07025e086849cf605ca1c9d6597dae46c0234c8229f"} Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.707569 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0f568d33ec8de950aa8c07025e086849cf605ca1c9d6597dae46c0234c8229f" Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.707514 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-w957z" Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.709850 4585 generic.go:334] "Generic (PLEG): container finished" podID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerID="a6634c0ef1a1562948706cf432145b94cb95b18b28a2970776b677077b76b153" exitCode=0 Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.709958 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" event={"ID":"86b120f0-0567-42fa-b617-69cf1d4854c7","Type":"ContainerDied","Data":"a6634c0ef1a1562948706cf432145b94cb95b18b28a2970776b677077b76b153"} Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.710654 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hklp2" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="registry-server" containerID="cri-o://faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1" gracePeriod=2 Feb 15 17:20:52 crc kubenswrapper[4585]: I0215 17:20:52.894690 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" path="/var/lib/kubelet/pods/62bf19dd-0faa-4e78-b944-ef44f34d3a66/volumes" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.094556 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:53 crc kubenswrapper[4585]: E0215 17:20:53.094857 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 15 17:20:53 crc kubenswrapper[4585]: E0215 17:20:53.094871 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 15 
17:20:53 crc kubenswrapper[4585]: E0215 17:20:53.094913 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift podName:92c8f627-225d-40ac-827b-d2f3476c1768 nodeName:}" failed. No retries permitted until 2026-02-15 17:20:55.094900605 +0000 UTC m=+911.038308737 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift") pod "swift-storage-0" (UID: "92c8f627-225d-40ac-827b-d2f3476c1768") : configmap "swift-ring-files" not found Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.182903 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.297722 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcjgd\" (UniqueName: \"kubernetes.io/projected/8916579d-e13d-4087-8c29-41a8ec210f9a-kube-api-access-fcjgd\") pod \"8916579d-e13d-4087-8c29-41a8ec210f9a\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.298264 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8916579d-e13d-4087-8c29-41a8ec210f9a-operator-scripts\") pod \"8916579d-e13d-4087-8c29-41a8ec210f9a\" (UID: \"8916579d-e13d-4087-8c29-41a8ec210f9a\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.299193 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8916579d-e13d-4087-8c29-41a8ec210f9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8916579d-e13d-4087-8c29-41a8ec210f9a" (UID: "8916579d-e13d-4087-8c29-41a8ec210f9a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.332933 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8916579d-e13d-4087-8c29-41a8ec210f9a-kube-api-access-fcjgd" (OuterVolumeSpecName: "kube-api-access-fcjgd") pod "8916579d-e13d-4087-8c29-41a8ec210f9a" (UID: "8916579d-e13d-4087-8c29-41a8ec210f9a"). InnerVolumeSpecName "kube-api-access-fcjgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.345016 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.365253 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.382390 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.400135 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8916579d-e13d-4087-8c29-41a8ec210f9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.400159 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcjgd\" (UniqueName: \"kubernetes.io/projected/8916579d-e13d-4087-8c29-41a8ec210f9a-kube-api-access-fcjgd\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.407341 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.500758 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94c30da-2a32-4256-a0a5-13c6c7a54725-operator-scripts\") pod \"c94c30da-2a32-4256-a0a5-13c6c7a54725\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.500841 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wbjz\" (UniqueName: \"kubernetes.io/projected/224730b2-4e83-42a8-b057-11c1ae0fd14f-kube-api-access-7wbjz\") pod \"224730b2-4e83-42a8-b057-11c1ae0fd14f\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.500877 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece8b5dd-b54b-400c-a62a-4995d15e8763-operator-scripts\") pod \"ece8b5dd-b54b-400c-a62a-4995d15e8763\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.500925 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2dbc\" (UniqueName: \"kubernetes.io/projected/ece8b5dd-b54b-400c-a62a-4995d15e8763-kube-api-access-j2dbc\") pod \"ece8b5dd-b54b-400c-a62a-4995d15e8763\" (UID: \"ece8b5dd-b54b-400c-a62a-4995d15e8763\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.500947 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-operator-scripts\") pod \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\" (UID: \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.500993 4585 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224730b2-4e83-42a8-b057-11c1ae0fd14f-operator-scripts\") pod \"224730b2-4e83-42a8-b057-11c1ae0fd14f\" (UID: \"224730b2-4e83-42a8-b057-11c1ae0fd14f\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.501024 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2bs6\" (UniqueName: \"kubernetes.io/projected/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-kube-api-access-f2bs6\") pod \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\" (UID: \"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.501113 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9jsr\" (UniqueName: \"kubernetes.io/projected/c94c30da-2a32-4256-a0a5-13c6c7a54725-kube-api-access-x9jsr\") pod \"c94c30da-2a32-4256-a0a5-13c6c7a54725\" (UID: \"c94c30da-2a32-4256-a0a5-13c6c7a54725\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.505758 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94c30da-2a32-4256-a0a5-13c6c7a54725-kube-api-access-x9jsr" (OuterVolumeSpecName: "kube-api-access-x9jsr") pod "c94c30da-2a32-4256-a0a5-13c6c7a54725" (UID: "c94c30da-2a32-4256-a0a5-13c6c7a54725"). InnerVolumeSpecName "kube-api-access-x9jsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.505893 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e26f1cb2-7405-4d6e-bbde-c172c4bc32b2" (UID: "e26f1cb2-7405-4d6e-bbde-c172c4bc32b2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.506148 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece8b5dd-b54b-400c-a62a-4995d15e8763-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ece8b5dd-b54b-400c-a62a-4995d15e8763" (UID: "ece8b5dd-b54b-400c-a62a-4995d15e8763"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.506377 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c94c30da-2a32-4256-a0a5-13c6c7a54725-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c94c30da-2a32-4256-a0a5-13c6c7a54725" (UID: "c94c30da-2a32-4256-a0a5-13c6c7a54725"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.506423 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/224730b2-4e83-42a8-b057-11c1ae0fd14f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "224730b2-4e83-42a8-b057-11c1ae0fd14f" (UID: "224730b2-4e83-42a8-b057-11c1ae0fd14f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.507423 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece8b5dd-b54b-400c-a62a-4995d15e8763-kube-api-access-j2dbc" (OuterVolumeSpecName: "kube-api-access-j2dbc") pod "ece8b5dd-b54b-400c-a62a-4995d15e8763" (UID: "ece8b5dd-b54b-400c-a62a-4995d15e8763"). InnerVolumeSpecName "kube-api-access-j2dbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.509773 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-kube-api-access-f2bs6" (OuterVolumeSpecName: "kube-api-access-f2bs6") pod "e26f1cb2-7405-4d6e-bbde-c172c4bc32b2" (UID: "e26f1cb2-7405-4d6e-bbde-c172c4bc32b2"). InnerVolumeSpecName "kube-api-access-f2bs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.509816 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/224730b2-4e83-42a8-b057-11c1ae0fd14f-kube-api-access-7wbjz" (OuterVolumeSpecName: "kube-api-access-7wbjz") pod "224730b2-4e83-42a8-b057-11c1ae0fd14f" (UID: "224730b2-4e83-42a8-b057-11c1ae0fd14f"). InnerVolumeSpecName "kube-api-access-7wbjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.562649 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.602955 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/224730b2-4e83-42a8-b057-11c1ae0fd14f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.602985 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2bs6\" (UniqueName: \"kubernetes.io/projected/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-kube-api-access-f2bs6\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.602995 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9jsr\" (UniqueName: \"kubernetes.io/projected/c94c30da-2a32-4256-a0a5-13c6c7a54725-kube-api-access-x9jsr\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.603007 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c94c30da-2a32-4256-a0a5-13c6c7a54725-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.603017 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wbjz\" (UniqueName: \"kubernetes.io/projected/224730b2-4e83-42a8-b057-11c1ae0fd14f-kube-api-access-7wbjz\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.603027 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece8b5dd-b54b-400c-a62a-4995d15e8763-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.603038 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2dbc\" (UniqueName: \"kubernetes.io/projected/ece8b5dd-b54b-400c-a62a-4995d15e8763-kube-api-access-j2dbc\") on node 
\"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.603047 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.703896 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-utilities\") pod \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.703990 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-catalog-content\") pod \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.704177 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzfq2\" (UniqueName: \"kubernetes.io/projected/89311b3a-05ab-45c7-a6df-e8f671bdab4e-kube-api-access-fzfq2\") pod \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\" (UID: \"89311b3a-05ab-45c7-a6df-e8f671bdab4e\") " Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.705112 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-utilities" (OuterVolumeSpecName: "utilities") pod "89311b3a-05ab-45c7-a6df-e8f671bdab4e" (UID: "89311b3a-05ab-45c7-a6df-e8f671bdab4e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.714139 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89311b3a-05ab-45c7-a6df-e8f671bdab4e-kube-api-access-fzfq2" (OuterVolumeSpecName: "kube-api-access-fzfq2") pod "89311b3a-05ab-45c7-a6df-e8f671bdab4e" (UID: "89311b3a-05ab-45c7-a6df-e8f671bdab4e"). InnerVolumeSpecName "kube-api-access-fzfq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.728546 4585 generic.go:334] "Generic (PLEG): container finished" podID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerID="faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1" exitCode=0 Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.728832 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hklp2" event={"ID":"89311b3a-05ab-45c7-a6df-e8f671bdab4e","Type":"ContainerDied","Data":"faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.729161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hklp2" event={"ID":"89311b3a-05ab-45c7-a6df-e8f671bdab4e","Type":"ContainerDied","Data":"2335b94b2d64cf0ba289b1e6ef597a6aa7d10f0303163d0f7df47c4aa978f412"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.729249 4585 scope.go:117] "RemoveContainer" containerID="faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.728932 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hklp2" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.730038 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89311b3a-05ab-45c7-a6df-e8f671bdab4e" (UID: "89311b3a-05ab-45c7-a6df-e8f671bdab4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.754031 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rtgm2" event={"ID":"ece8b5dd-b54b-400c-a62a-4995d15e8763","Type":"ContainerDied","Data":"6a6636726e535b693680532c3c192e6a679cb1536abbf74f99048e274a9bd24d"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.754058 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6636726e535b693680532c3c192e6a679cb1536abbf74f99048e274a9bd24d" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.754108 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rtgm2" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.768690 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1491-account-create-update-f9fqz" event={"ID":"c94c30da-2a32-4256-a0a5-13c6c7a54725","Type":"ContainerDied","Data":"b8e179cd1cd338035d91fbf1ba5ca019fcd63e90d2b12c12c9930c98ab4d97fe"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.768771 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8e179cd1cd338035d91fbf1ba5ca019fcd63e90d2b12c12c9930c98ab4d97fe" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.768880 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1491-account-create-update-f9fqz" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.772245 4585 scope.go:117] "RemoveContainer" containerID="ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.784804 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" event={"ID":"86b120f0-0567-42fa-b617-69cf1d4854c7","Type":"ContainerStarted","Data":"445760de021578cfbcb54d581b7cb9c8735b50399d6b0c66343381a2dff920a7"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.785660 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.794569 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b1da-account-create-update-nqw28" event={"ID":"224730b2-4e83-42a8-b057-11c1ae0fd14f","Type":"ContainerDied","Data":"638abc2f46adce1b6b614c4dc0caf97a280b2fbc889944a135c9bcea68568f9a"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.794623 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="638abc2f46adce1b6b614c4dc0caf97a280b2fbc889944a135c9bcea68568f9a" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.794579 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b1da-account-create-update-nqw28" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.796706 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2vh2n" event={"ID":"8916579d-e13d-4087-8c29-41a8ec210f9a","Type":"ContainerDied","Data":"f7f9e5bde28b148ed927a82975bc2fdbcec0fa0bb33d0b2bdbe702d0898023e3"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.796815 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f9e5bde28b148ed927a82975bc2fdbcec0fa0bb33d0b2bdbe702d0898023e3" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.796906 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2vh2n" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.803897 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" podStartSLOduration=3.803884052 podStartE2EDuration="3.803884052s" podCreationTimestamp="2026-02-15 17:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:20:53.803524182 +0000 UTC m=+909.746932314" watchObservedRunningTime="2026-02-15 17:20:53.803884052 +0000 UTC m=+909.747292184" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.817068 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fbb4-account-create-update-pc58j" event={"ID":"e26f1cb2-7405-4d6e-bbde-c172c4bc32b2","Type":"ContainerDied","Data":"65a2dee02e0546d98b0533f6c018b1cbeb0b18c5ac900c86936080810d10cf84"} Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.817113 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65a2dee02e0546d98b0533f6c018b1cbeb0b18c5ac900c86936080810d10cf84" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.817831 4585 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fbb4-account-create-update-pc58j" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.823207 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.825328 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzfq2\" (UniqueName: \"kubernetes.io/projected/89311b3a-05ab-45c7-a6df-e8f671bdab4e-kube-api-access-fzfq2\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.825360 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89311b3a-05ab-45c7-a6df-e8f671bdab4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.828800 4585 scope.go:117] "RemoveContainer" containerID="87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.881977 4585 scope.go:117] "RemoveContainer" containerID="faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1" Feb 15 17:20:53 crc kubenswrapper[4585]: E0215 17:20:53.884528 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1\": container with ID starting with faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1 not found: ID does not exist" containerID="faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.884582 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1"} err="failed to get 
container status \"faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1\": rpc error: code = NotFound desc = could not find container \"faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1\": container with ID starting with faf1aa83952c411c1042a57699f723a4b56ecb6488aa1dbc253ab1187755c9f1 not found: ID does not exist" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.884617 4585 scope.go:117] "RemoveContainer" containerID="ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad" Feb 15 17:20:53 crc kubenswrapper[4585]: E0215 17:20:53.886141 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad\": container with ID starting with ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad not found: ID does not exist" containerID="ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.886187 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad"} err="failed to get container status \"ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad\": rpc error: code = NotFound desc = could not find container \"ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad\": container with ID starting with ffbea6ff0153091b155eef1ef73a70461876637c7d834bcc2116465f6bfdadad not found: ID does not exist" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.886203 4585 scope.go:117] "RemoveContainer" containerID="87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5" Feb 15 17:20:53 crc kubenswrapper[4585]: E0215 17:20:53.886706 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5\": container with ID starting with 87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5 not found: ID does not exist" containerID="87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5" Feb 15 17:20:53 crc kubenswrapper[4585]: I0215 17:20:53.886730 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5"} err="failed to get container status \"87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5\": rpc error: code = NotFound desc = could not find container \"87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5\": container with ID starting with 87fe825be8913c60db57a004d3debb9e4be09318a45bec21dc27c7a5d528a0b5 not found: ID does not exist" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.063052 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hklp2"] Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.074682 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hklp2"] Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.795727 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-sk4vl"] Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805368 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerName="dnsmasq-dns" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805398 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerName="dnsmasq-dns" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805407 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="registry-server" Feb 15 17:20:54 crc 
kubenswrapper[4585]: I0215 17:20:54.805413 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="registry-server" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805433 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94c30da-2a32-4256-a0a5-13c6c7a54725" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805439 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94c30da-2a32-4256-a0a5-13c6c7a54725" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805449 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224730b2-4e83-42a8-b057-11c1ae0fd14f" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805456 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="224730b2-4e83-42a8-b057-11c1ae0fd14f" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805479 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26f1cb2-7405-4d6e-bbde-c172c4bc32b2" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805485 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26f1cb2-7405-4d6e-bbde-c172c4bc32b2" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805497 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8916579d-e13d-4087-8c29-41a8ec210f9a" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805503 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8916579d-e13d-4087-8c29-41a8ec210f9a" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805510 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ece8b5dd-b54b-400c-a62a-4995d15e8763" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805516 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece8b5dd-b54b-400c-a62a-4995d15e8763" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805527 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="extract-utilities" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805533 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="extract-utilities" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805542 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerName="init" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805548 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerName="init" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805558 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="extract-content" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805563 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="extract-content" Feb 15 17:20:54 crc kubenswrapper[4585]: E0215 17:20:54.805576 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab2a2c2-523c-43c3-9776-902776a34ad4" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805581 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab2a2c2-523c-43c3-9776-902776a34ad4" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805764 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ece8b5dd-b54b-400c-a62a-4995d15e8763" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805787 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="224730b2-4e83-42a8-b057-11c1ae0fd14f" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805801 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" containerName="registry-server" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805817 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab2a2c2-523c-43c3-9776-902776a34ad4" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805825 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94c30da-2a32-4256-a0a5-13c6c7a54725" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805835 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8916579d-e13d-4087-8c29-41a8ec210f9a" containerName="mariadb-database-create" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805845 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26f1cb2-7405-4d6e-bbde-c172c4bc32b2" containerName="mariadb-account-create-update" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.805856 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="62bf19dd-0faa-4e78-b944-ef44f34d3a66" containerName="dnsmasq-dns" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.806345 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sk4vl"] Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.806412 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.834653 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.847026 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fdc\" (UniqueName: \"kubernetes.io/projected/706054f0-aa3c-4907-bca2-864e673da1d7-kube-api-access-b8fdc\") pod \"root-account-create-update-sk4vl\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.847104 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706054f0-aa3c-4907-bca2-864e673da1d7-operator-scripts\") pod \"root-account-create-update-sk4vl\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.856725 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89311b3a-05ab-45c7-a6df-e8f671bdab4e" path="/var/lib/kubelet/pods/89311b3a-05ab-45c7-a6df-e8f671bdab4e/volumes" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.948233 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fdc\" (UniqueName: \"kubernetes.io/projected/706054f0-aa3c-4907-bca2-864e673da1d7-kube-api-access-b8fdc\") pod \"root-account-create-update-sk4vl\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.948320 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/706054f0-aa3c-4907-bca2-864e673da1d7-operator-scripts\") pod \"root-account-create-update-sk4vl\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.949132 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706054f0-aa3c-4907-bca2-864e673da1d7-operator-scripts\") pod \"root-account-create-update-sk4vl\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:54 crc kubenswrapper[4585]: I0215 17:20:54.971259 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fdc\" (UniqueName: \"kubernetes.io/projected/706054f0-aa3c-4907-bca2-864e673da1d7-kube-api-access-b8fdc\") pod \"root-account-create-update-sk4vl\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.146509 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.151407 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:55 crc kubenswrapper[4585]: E0215 17:20:55.151656 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 15 17:20:55 crc kubenswrapper[4585]: E0215 17:20:55.151680 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 15 17:20:55 crc kubenswrapper[4585]: E0215 17:20:55.151744 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift podName:92c8f627-225d-40ac-827b-d2f3476c1768 nodeName:}" failed. No retries permitted until 2026-02-15 17:20:59.151725045 +0000 UTC m=+915.095133177 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift") pod "swift-storage-0" (UID: "92c8f627-225d-40ac-827b-d2f3476c1768") : configmap "swift-ring-files" not found Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.155563 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6ls6x"] Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.156735 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.159324 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.159647 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.159812 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.220757 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6ls6x"] Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.258976 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-ring-data-devices\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.259213 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-dispersionconf\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.259246 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzwhl\" (UniqueName: \"kubernetes.io/projected/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-kube-api-access-xzwhl\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc 
kubenswrapper[4585]: I0215 17:20:55.259266 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-scripts\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.259300 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-etc-swift\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.259331 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-swiftconf\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.259377 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-combined-ca-bundle\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.361619 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-ring-data-devices\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: 
I0215 17:20:55.361704 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-dispersionconf\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.361727 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzwhl\" (UniqueName: \"kubernetes.io/projected/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-kube-api-access-xzwhl\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.361766 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-scripts\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.361787 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-etc-swift\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.361818 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-swiftconf\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.362680 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-ring-data-devices\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.362772 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-combined-ca-bundle\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.363151 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-scripts\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.363324 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-etc-swift\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.369410 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-combined-ca-bundle\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.370141 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-dispersionconf\") pod 
\"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.370194 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-swiftconf\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.378309 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzwhl\" (UniqueName: \"kubernetes.io/projected/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-kube-api-access-xzwhl\") pod \"swift-ring-rebalance-6ls6x\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.460183 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-sk4vl"] Feb 15 17:20:55 crc kubenswrapper[4585]: W0215 17:20:55.462679 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706054f0_aa3c_4907_bca2_864e673da1d7.slice/crio-e3dd89ebadce118ef0cf7dbfcf066d7314bf06871a2b7cce72a02605cb1be980 WatchSource:0}: Error finding container e3dd89ebadce118ef0cf7dbfcf066d7314bf06871a2b7cce72a02605cb1be980: Status 404 returned error can't find the container with id e3dd89ebadce118ef0cf7dbfcf066d7314bf06871a2b7cce72a02605cb1be980 Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.463830 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.609335 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.859217 4585 generic.go:334] "Generic (PLEG): container finished" podID="706054f0-aa3c-4907-bca2-864e673da1d7" containerID="362dd094c9efa09227c6baee153a70b11e42b2e27e174b9f80c8edb9192ffcf7" exitCode=0 Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.859343 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sk4vl" event={"ID":"706054f0-aa3c-4907-bca2-864e673da1d7","Type":"ContainerDied","Data":"362dd094c9efa09227c6baee153a70b11e42b2e27e174b9f80c8edb9192ffcf7"} Feb 15 17:20:55 crc kubenswrapper[4585]: I0215 17:20:55.859413 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sk4vl" event={"ID":"706054f0-aa3c-4907-bca2-864e673da1d7","Type":"ContainerStarted","Data":"e3dd89ebadce118ef0cf7dbfcf066d7314bf06871a2b7cce72a02605cb1be980"} Feb 15 17:20:56 crc kubenswrapper[4585]: I0215 17:20:56.058305 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6ls6x"] Feb 15 17:20:56 crc kubenswrapper[4585]: I0215 17:20:56.870289 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6ls6x" event={"ID":"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6","Type":"ContainerStarted","Data":"86b40e8c76cc7a1eb56be6aea205b538c51af4c23d2d2451eeff9a7503c00ca2"} Feb 15 17:20:56 crc kubenswrapper[4585]: I0215 17:20:56.930291 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4z4g6"] Feb 15 17:20:56 crc kubenswrapper[4585]: I0215 17:20:56.932492 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:56 crc kubenswrapper[4585]: I0215 17:20:56.942585 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4z4g6"] Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.007217 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-catalog-content\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.007264 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t48j4\" (UniqueName: \"kubernetes.io/projected/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-kube-api-access-t48j4\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.007309 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-utilities\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.108484 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-catalog-content\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.108532 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t48j4\" (UniqueName: \"kubernetes.io/projected/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-kube-api-access-t48j4\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.108579 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-utilities\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.109031 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-utilities\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.109770 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-catalog-content\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.152739 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t48j4\" (UniqueName: \"kubernetes.io/projected/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-kube-api-access-t48j4\") pod \"certified-operators-4z4g6\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.267613 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.499080 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.645770 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706054f0-aa3c-4907-bca2-864e673da1d7-operator-scripts\") pod \"706054f0-aa3c-4907-bca2-864e673da1d7\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.645848 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8fdc\" (UniqueName: \"kubernetes.io/projected/706054f0-aa3c-4907-bca2-864e673da1d7-kube-api-access-b8fdc\") pod \"706054f0-aa3c-4907-bca2-864e673da1d7\" (UID: \"706054f0-aa3c-4907-bca2-864e673da1d7\") " Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.647274 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706054f0-aa3c-4907-bca2-864e673da1d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "706054f0-aa3c-4907-bca2-864e673da1d7" (UID: "706054f0-aa3c-4907-bca2-864e673da1d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.653655 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706054f0-aa3c-4907-bca2-864e673da1d7-kube-api-access-b8fdc" (OuterVolumeSpecName: "kube-api-access-b8fdc") pod "706054f0-aa3c-4907-bca2-864e673da1d7" (UID: "706054f0-aa3c-4907-bca2-864e673da1d7"). InnerVolumeSpecName "kube-api-access-b8fdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.747982 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8fdc\" (UniqueName: \"kubernetes.io/projected/706054f0-aa3c-4907-bca2-864e673da1d7-kube-api-access-b8fdc\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.748010 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/706054f0-aa3c-4907-bca2-864e673da1d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.838042 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4z4g6"] Feb 15 17:20:57 crc kubenswrapper[4585]: W0215 17:20:57.843619 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50173d4_2ad6_4d52_92cb_e357aaa1e8ac.slice/crio-4bd7bb775bad204a7a929d08e1ad1848998e83c914b077f74b20f4a34cd499f0 WatchSource:0}: Error finding container 4bd7bb775bad204a7a929d08e1ad1848998e83c914b077f74b20f4a34cd499f0: Status 404 returned error can't find the container with id 4bd7bb775bad204a7a929d08e1ad1848998e83c914b077f74b20f4a34cd499f0 Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.881118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-sk4vl" event={"ID":"706054f0-aa3c-4907-bca2-864e673da1d7","Type":"ContainerDied","Data":"e3dd89ebadce118ef0cf7dbfcf066d7314bf06871a2b7cce72a02605cb1be980"} Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.881160 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3dd89ebadce118ef0cf7dbfcf066d7314bf06871a2b7cce72a02605cb1be980" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.881232 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-sk4vl" Feb 15 17:20:57 crc kubenswrapper[4585]: I0215 17:20:57.896224 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4z4g6" event={"ID":"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac","Type":"ContainerStarted","Data":"4bd7bb775bad204a7a929d08e1ad1848998e83c914b077f74b20f4a34cd499f0"} Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.604269 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7nlr5"] Feb 15 17:20:58 crc kubenswrapper[4585]: E0215 17:20:58.605036 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706054f0-aa3c-4907-bca2-864e673da1d7" containerName="mariadb-account-create-update" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.605058 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="706054f0-aa3c-4907-bca2-864e673da1d7" containerName="mariadb-account-create-update" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.605276 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="706054f0-aa3c-4907-bca2-864e673da1d7" containerName="mariadb-account-create-update" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.605833 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.608116 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-trff6" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.608172 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.616679 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7nlr5"] Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.663707 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-db-sync-config-data\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.663757 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-config-data\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.663791 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp2rv\" (UniqueName: \"kubernetes.io/projected/d959ed97-3c6f-4503-864e-57104658b927-kube-api-access-dp2rv\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.663841 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-combined-ca-bundle\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.765112 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-db-sync-config-data\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.765166 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-config-data\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.765204 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp2rv\" (UniqueName: \"kubernetes.io/projected/d959ed97-3c6f-4503-864e-57104658b927-kube-api-access-dp2rv\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.765258 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-combined-ca-bundle\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.770256 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-config-data\") pod \"glance-db-sync-7nlr5\" (UID: 
\"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.770291 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-combined-ca-bundle\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.771809 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-db-sync-config-data\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.780868 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp2rv\" (UniqueName: \"kubernetes.io/projected/d959ed97-3c6f-4503-864e-57104658b927-kube-api-access-dp2rv\") pod \"glance-db-sync-7nlr5\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") " pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.909346 4585 generic.go:334] "Generic (PLEG): container finished" podID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerID="8aa4fe64ca62dadc07d5818e79a7db0e6fb71fb3a84d485f2c7df6711af125df" exitCode=0 Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.909390 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4z4g6" event={"ID":"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac","Type":"ContainerDied","Data":"8aa4fe64ca62dadc07d5818e79a7db0e6fb71fb3a84d485f2c7df6711af125df"} Feb 15 17:20:58 crc kubenswrapper[4585]: I0215 17:20:58.930957 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7nlr5" Feb 15 17:20:59 crc kubenswrapper[4585]: I0215 17:20:59.172793 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:20:59 crc kubenswrapper[4585]: E0215 17:20:59.173008 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 15 17:20:59 crc kubenswrapper[4585]: E0215 17:20:59.173042 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 15 17:20:59 crc kubenswrapper[4585]: E0215 17:20:59.173108 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift podName:92c8f627-225d-40ac-827b-d2f3476c1768 nodeName:}" failed. No retries permitted until 2026-02-15 17:21:07.173087251 +0000 UTC m=+923.116495383 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift") pod "swift-storage-0" (UID: "92c8f627-225d-40ac-827b-d2f3476c1768") : configmap "swift-ring-files" not found Feb 15 17:21:00 crc kubenswrapper[4585]: I0215 17:21:00.561514 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" Feb 15 17:21:00 crc kubenswrapper[4585]: I0215 17:21:00.640984 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ds72g"] Feb 15 17:21:00 crc kubenswrapper[4585]: I0215 17:21:00.641207 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-ds72g" podUID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerName="dnsmasq-dns" containerID="cri-o://8398705ed3c434e0f19b2d22e75c60b15b7c5aeef5327e7e52380ac7257c20bc" gracePeriod=10 Feb 15 17:21:00 crc kubenswrapper[4585]: I0215 17:21:00.853830 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:21:00 crc kubenswrapper[4585]: I0215 17:21:00.924438 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:21:00 crc kubenswrapper[4585]: I0215 17:21:00.934367 4585 generic.go:334] "Generic (PLEG): container finished" podID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerID="8398705ed3c434e0f19b2d22e75c60b15b7c5aeef5327e7e52380ac7257c20bc" exitCode=0 Feb 15 17:21:00 crc kubenswrapper[4585]: I0215 17:21:00.935076 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ds72g" event={"ID":"2b2bae3d-b010-43cc-ad44-5573802b4517","Type":"ContainerDied","Data":"8398705ed3c434e0f19b2d22e75c60b15b7c5aeef5327e7e52380ac7257c20bc"} Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.144051 4585 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/root-account-create-update-sk4vl"] Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.153298 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-sk4vl"] Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.372357 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.515053 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-sb\") pod \"2b2bae3d-b010-43cc-ad44-5573802b4517\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.515829 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-config\") pod \"2b2bae3d-b010-43cc-ad44-5573802b4517\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.515924 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-dns-svc\") pod \"2b2bae3d-b010-43cc-ad44-5573802b4517\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.516047 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z68fh\" (UniqueName: \"kubernetes.io/projected/2b2bae3d-b010-43cc-ad44-5573802b4517-kube-api-access-z68fh\") pod \"2b2bae3d-b010-43cc-ad44-5573802b4517\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.516092 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-nb\") pod \"2b2bae3d-b010-43cc-ad44-5573802b4517\" (UID: \"2b2bae3d-b010-43cc-ad44-5573802b4517\") " Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.532418 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2bae3d-b010-43cc-ad44-5573802b4517-kube-api-access-z68fh" (OuterVolumeSpecName: "kube-api-access-z68fh") pod "2b2bae3d-b010-43cc-ad44-5573802b4517" (UID: "2b2bae3d-b010-43cc-ad44-5573802b4517"). InnerVolumeSpecName "kube-api-access-z68fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.565943 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b2bae3d-b010-43cc-ad44-5573802b4517" (UID: "2b2bae3d-b010-43cc-ad44-5573802b4517"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.573126 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b2bae3d-b010-43cc-ad44-5573802b4517" (UID: "2b2bae3d-b010-43cc-ad44-5573802b4517"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.577236 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b2bae3d-b010-43cc-ad44-5573802b4517" (UID: "2b2bae3d-b010-43cc-ad44-5573802b4517"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.595794 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-config" (OuterVolumeSpecName: "config") pod "2b2bae3d-b010-43cc-ad44-5573802b4517" (UID: "2b2bae3d-b010-43cc-ad44-5573802b4517"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.618202 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.618230 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.618240 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z68fh\" (UniqueName: \"kubernetes.io/projected/2b2bae3d-b010-43cc-ad44-5573802b4517-kube-api-access-z68fh\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.618254 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.618263 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b2bae3d-b010-43cc-ad44-5573802b4517-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.681417 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7nlr5"] Feb 15 17:21:01 crc kubenswrapper[4585]: 
I0215 17:21:01.942861 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6ls6x" event={"ID":"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6","Type":"ContainerStarted","Data":"e232555febe7a498c944c2c70323815ed84c553a02b650c342d3067f03e87d24"} Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.945955 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-ds72g" event={"ID":"2b2bae3d-b010-43cc-ad44-5573802b4517","Type":"ContainerDied","Data":"a764b55e48a5aa48e7eb67f339a9df13e872c390762e6f4551b20dc2dfe5fb00"} Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.946000 4585 scope.go:117] "RemoveContainer" containerID="8398705ed3c434e0f19b2d22e75c60b15b7c5aeef5327e7e52380ac7257c20bc" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.946126 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-ds72g" Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.950366 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4z4g6" event={"ID":"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac","Type":"ContainerStarted","Data":"515f133c401184a1132b0d4f79d67a63d223e52167033f30663a663999435c73"} Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.952560 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7nlr5" event={"ID":"d959ed97-3c6f-4503-864e-57104658b927","Type":"ContainerStarted","Data":"6a32a2a9550c52c12ae604365c7891563f9a2a57e1d750924e8f01196c4f0c90"} Feb 15 17:21:01 crc kubenswrapper[4585]: I0215 17:21:01.973896 4585 scope.go:117] "RemoveContainer" containerID="f634dd15c25ca67e29a95602961918227f95f77fde20cd8620915696830b8ee5" Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.005409 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6ls6x" podStartSLOduration=1.98200854 
podStartE2EDuration="7.005392833s" podCreationTimestamp="2026-02-15 17:20:55 +0000 UTC" firstStartedPulling="2026-02-15 17:20:56.056949102 +0000 UTC m=+912.000357234" lastFinishedPulling="2026-02-15 17:21:01.080333385 +0000 UTC m=+917.023741527" observedRunningTime="2026-02-15 17:21:01.966556894 +0000 UTC m=+917.909965026" watchObservedRunningTime="2026-02-15 17:21:02.005392833 +0000 UTC m=+917.948800965" Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.058351 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ds72g"] Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.064259 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-ds72g"] Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.109938 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jr8n"] Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.110137 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8jr8n" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="registry-server" containerID="cri-o://1cbdd76baa1eb5f1731d6420e1f3c4c9731ed2da2cf613832406017084f45b0e" gracePeriod=2 Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.867676 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2bae3d-b010-43cc-ad44-5573802b4517" path="/var/lib/kubelet/pods/2b2bae3d-b010-43cc-ad44-5573802b4517/volumes" Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.868480 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706054f0-aa3c-4907-bca2-864e673da1d7" path="/var/lib/kubelet/pods/706054f0-aa3c-4907-bca2-864e673da1d7/volumes" Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.971942 4585 generic.go:334] "Generic (PLEG): container finished" podID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" 
containerID="515f133c401184a1132b0d4f79d67a63d223e52167033f30663a663999435c73" exitCode=0 Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.974584 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4z4g6" event={"ID":"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac","Type":"ContainerDied","Data":"515f133c401184a1132b0d4f79d67a63d223e52167033f30663a663999435c73"} Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.984506 4585 generic.go:334] "Generic (PLEG): container finished" podID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerID="1cbdd76baa1eb5f1731d6420e1f3c4c9731ed2da2cf613832406017084f45b0e" exitCode=0 Feb 15 17:21:02 crc kubenswrapper[4585]: I0215 17:21:02.985080 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jr8n" event={"ID":"0e2147ab-2614-4b64-9f18-a45bec1f4937","Type":"ContainerDied","Data":"1cbdd76baa1eb5f1731d6420e1f3c4c9731ed2da2cf613832406017084f45b0e"} Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.132670 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.200105 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpgvm\" (UniqueName: \"kubernetes.io/projected/0e2147ab-2614-4b64-9f18-a45bec1f4937-kube-api-access-fpgvm\") pod \"0e2147ab-2614-4b64-9f18-a45bec1f4937\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.200162 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-catalog-content\") pod \"0e2147ab-2614-4b64-9f18-a45bec1f4937\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.200247 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-utilities\") pod \"0e2147ab-2614-4b64-9f18-a45bec1f4937\" (UID: \"0e2147ab-2614-4b64-9f18-a45bec1f4937\") " Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.201015 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-utilities" (OuterVolumeSpecName: "utilities") pod "0e2147ab-2614-4b64-9f18-a45bec1f4937" (UID: "0e2147ab-2614-4b64-9f18-a45bec1f4937"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.223825 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2147ab-2614-4b64-9f18-a45bec1f4937-kube-api-access-fpgvm" (OuterVolumeSpecName: "kube-api-access-fpgvm") pod "0e2147ab-2614-4b64-9f18-a45bec1f4937" (UID: "0e2147ab-2614-4b64-9f18-a45bec1f4937"). InnerVolumeSpecName "kube-api-access-fpgvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.251521 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e2147ab-2614-4b64-9f18-a45bec1f4937" (UID: "0e2147ab-2614-4b64-9f18-a45bec1f4937"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.301803 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.301836 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpgvm\" (UniqueName: \"kubernetes.io/projected/0e2147ab-2614-4b64-9f18-a45bec1f4937-kube-api-access-fpgvm\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.301849 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2147ab-2614-4b64-9f18-a45bec1f4937-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.996817 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jr8n" event={"ID":"0e2147ab-2614-4b64-9f18-a45bec1f4937","Type":"ContainerDied","Data":"6e8fb597f958a483e669d8652e9fdc5ef9c4f71c4095377845b722cf75306620"} Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.997077 4585 scope.go:117] "RemoveContainer" containerID="1cbdd76baa1eb5f1731d6420e1f3c4c9731ed2da2cf613832406017084f45b0e" Feb 15 17:21:03 crc kubenswrapper[4585]: I0215 17:21:03.996863 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jr8n" Feb 15 17:21:04 crc kubenswrapper[4585]: I0215 17:21:04.004558 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4z4g6" event={"ID":"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac","Type":"ContainerStarted","Data":"ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad"} Feb 15 17:21:04 crc kubenswrapper[4585]: I0215 17:21:04.047535 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4z4g6" podStartSLOduration=3.930765567 podStartE2EDuration="8.047516563s" podCreationTimestamp="2026-02-15 17:20:56 +0000 UTC" firstStartedPulling="2026-02-15 17:20:59.500675622 +0000 UTC m=+915.444083754" lastFinishedPulling="2026-02-15 17:21:03.617426618 +0000 UTC m=+919.560834750" observedRunningTime="2026-02-15 17:21:04.027142108 +0000 UTC m=+919.970550240" watchObservedRunningTime="2026-02-15 17:21:04.047516563 +0000 UTC m=+919.990924695" Feb 15 17:21:04 crc kubenswrapper[4585]: I0215 17:21:04.049679 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jr8n"] Feb 15 17:21:04 crc kubenswrapper[4585]: I0215 17:21:04.050730 4585 scope.go:117] "RemoveContainer" containerID="d2a01168e4e4a5d2788b0cae67b14d6557c565c8217471f1fc02ea8aa4bc3550" Feb 15 17:21:04 crc kubenswrapper[4585]: I0215 17:21:04.056787 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8jr8n"] Feb 15 17:21:04 crc kubenswrapper[4585]: I0215 17:21:04.101303 4585 scope.go:117] "RemoveContainer" containerID="67a34be8280d6d2510a13be9fdc5422b73bec7c2221a31f9c318d093148d9b13" Feb 15 17:21:04 crc kubenswrapper[4585]: I0215 17:21:04.855613 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" path="/var/lib/kubelet/pods/0e2147ab-2614-4b64-9f18-a45bec1f4937/volumes" Feb 15 
17:21:05 crc kubenswrapper[4585]: I0215 17:21:05.398509 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.162353 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-f248v"] Feb 15 17:21:06 crc kubenswrapper[4585]: E0215 17:21:06.162838 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerName="init" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.162864 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerName="init" Feb 15 17:21:06 crc kubenswrapper[4585]: E0215 17:21:06.162875 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="extract-content" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.162883 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="extract-content" Feb 15 17:21:06 crc kubenswrapper[4585]: E0215 17:21:06.162898 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="extract-utilities" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.162906 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="extract-utilities" Feb 15 17:21:06 crc kubenswrapper[4585]: E0215 17:21:06.162926 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerName="dnsmasq-dns" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.162932 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerName="dnsmasq-dns" Feb 15 17:21:06 crc kubenswrapper[4585]: E0215 17:21:06.162957 4585 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="registry-server" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.162965 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="registry-server" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.163214 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e2147ab-2614-4b64-9f18-a45bec1f4937" containerName="registry-server" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.163250 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2bae3d-b010-43cc-ad44-5573802b4517" containerName="dnsmasq-dns" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.163857 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-f248v" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.165494 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.176925 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-f248v"] Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.280571 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1e0d19-52a5-4959-9e9f-74094993a95c-operator-scripts\") pod \"root-account-create-update-f248v\" (UID: \"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " pod="openstack/root-account-create-update-f248v" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.281019 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2kxd\" (UniqueName: \"kubernetes.io/projected/0b1e0d19-52a5-4959-9e9f-74094993a95c-kube-api-access-z2kxd\") pod \"root-account-create-update-f248v\" (UID: 
\"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " pod="openstack/root-account-create-update-f248v" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.382837 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1e0d19-52a5-4959-9e9f-74094993a95c-operator-scripts\") pod \"root-account-create-update-f248v\" (UID: \"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " pod="openstack/root-account-create-update-f248v" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.382923 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2kxd\" (UniqueName: \"kubernetes.io/projected/0b1e0d19-52a5-4959-9e9f-74094993a95c-kube-api-access-z2kxd\") pod \"root-account-create-update-f248v\" (UID: \"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " pod="openstack/root-account-create-update-f248v" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.385006 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1e0d19-52a5-4959-9e9f-74094993a95c-operator-scripts\") pod \"root-account-create-update-f248v\" (UID: \"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " pod="openstack/root-account-create-update-f248v" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.402391 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2kxd\" (UniqueName: \"kubernetes.io/projected/0b1e0d19-52a5-4959-9e9f-74094993a95c-kube-api-access-z2kxd\") pod \"root-account-create-update-f248v\" (UID: \"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " pod="openstack/root-account-create-update-f248v" Feb 15 17:21:06 crc kubenswrapper[4585]: I0215 17:21:06.491757 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-f248v" Feb 15 17:21:07 crc kubenswrapper[4585]: I0215 17:21:07.068346 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-f248v"] Feb 15 17:21:07 crc kubenswrapper[4585]: I0215 17:21:07.201890 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:21:07 crc kubenswrapper[4585]: E0215 17:21:07.202134 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 15 17:21:07 crc kubenswrapper[4585]: E0215 17:21:07.202187 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 15 17:21:07 crc kubenswrapper[4585]: E0215 17:21:07.202275 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift podName:92c8f627-225d-40ac-827b-d2f3476c1768 nodeName:}" failed. No retries permitted until 2026-02-15 17:21:23.202233403 +0000 UTC m=+939.145641545 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift") pod "swift-storage-0" (UID: "92c8f627-225d-40ac-827b-d2f3476c1768") : configmap "swift-ring-files" not found Feb 15 17:21:07 crc kubenswrapper[4585]: I0215 17:21:07.269140 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:21:07 crc kubenswrapper[4585]: I0215 17:21:07.269194 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:21:07 crc kubenswrapper[4585]: I0215 17:21:07.330029 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:21:08 crc kubenswrapper[4585]: I0215 17:21:08.046677 4585 generic.go:334] "Generic (PLEG): container finished" podID="0b1e0d19-52a5-4959-9e9f-74094993a95c" containerID="a113f417811566209ff92b27bc2faec207eb4ff9fabbbfb28ee4d56728279113" exitCode=0 Feb 15 17:21:08 crc kubenswrapper[4585]: I0215 17:21:08.046847 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-f248v" event={"ID":"0b1e0d19-52a5-4959-9e9f-74094993a95c","Type":"ContainerDied","Data":"a113f417811566209ff92b27bc2faec207eb4ff9fabbbfb28ee4d56728279113"} Feb 15 17:21:08 crc kubenswrapper[4585]: I0215 17:21:08.046901 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-f248v" event={"ID":"0b1e0d19-52a5-4959-9e9f-74094993a95c","Type":"ContainerStarted","Data":"3ad8129785b85c04dadd6df09455c9038647021e4cd0f4f882482990e7aaa346"} Feb 15 17:21:08 crc kubenswrapper[4585]: I0215 17:21:08.097941 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:21:08 crc kubenswrapper[4585]: I0215 17:21:08.475883 4585 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/ovn-controller-87pkc" podUID="2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9" containerName="ovn-controller" probeResult="failure" output=< Feb 15 17:21:08 crc kubenswrapper[4585]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 15 17:21:08 crc kubenswrapper[4585]: > Feb 15 17:21:09 crc kubenswrapper[4585]: I0215 17:21:09.054768 4585 generic.go:334] "Generic (PLEG): container finished" podID="fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" containerID="e232555febe7a498c944c2c70323815ed84c553a02b650c342d3067f03e87d24" exitCode=0 Feb 15 17:21:09 crc kubenswrapper[4585]: I0215 17:21:09.054866 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6ls6x" event={"ID":"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6","Type":"ContainerDied","Data":"e232555febe7a498c944c2c70323815ed84c553a02b650c342d3067f03e87d24"} Feb 15 17:21:09 crc kubenswrapper[4585]: I0215 17:21:09.322550 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4z4g6"] Feb 15 17:21:10 crc kubenswrapper[4585]: I0215 17:21:10.064511 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4z4g6" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="registry-server" containerID="cri-o://ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad" gracePeriod=2 Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.081541 4585 generic.go:334] "Generic (PLEG): container finished" podID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerID="ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad" exitCode=0 Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.081815 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4z4g6" 
event={"ID":"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac","Type":"ContainerDied","Data":"ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad"} Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.757689 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kwgww"] Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.763330 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.783978 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwgww"] Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.802546 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-catalog-content\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.802950 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-utilities\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.803211 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjsk2\" (UniqueName: \"kubernetes.io/projected/4e9a733a-04ee-4eff-bf0a-91f112778d91-kube-api-access-pjsk2\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.905266 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pjsk2\" (UniqueName: \"kubernetes.io/projected/4e9a733a-04ee-4eff-bf0a-91f112778d91-kube-api-access-pjsk2\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.906941 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-catalog-content\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.907097 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-utilities\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.907797 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-catalog-content\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.907831 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-utilities\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:11 crc kubenswrapper[4585]: I0215 17:21:11.924947 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjsk2\" 
(UniqueName: \"kubernetes.io/projected/4e9a733a-04ee-4eff-bf0a-91f112778d91-kube-api-access-pjsk2\") pod \"redhat-operators-kwgww\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") " pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:12 crc kubenswrapper[4585]: I0215 17:21:12.084962 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.482514 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-87pkc" podUID="2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9" containerName="ovn-controller" probeResult="failure" output=< Feb 15 17:21:13 crc kubenswrapper[4585]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 15 17:21:13 crc kubenswrapper[4585]: > Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.514972 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.520474 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hzct7" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.771020 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-87pkc-config-mdqlf"] Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.772232 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.798167 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.824551 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-87pkc-config-mdqlf"] Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.874488 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run-ovn\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.874912 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-scripts\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.875033 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cctcf\" (UniqueName: \"kubernetes.io/projected/9dd44da9-95a1-471a-99de-19e56ac2f90b-kube-api-access-cctcf\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.875085 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-log-ovn\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: 
\"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.875207 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-additional-scripts\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.875273 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.976920 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cctcf\" (UniqueName: \"kubernetes.io/projected/9dd44da9-95a1-471a-99de-19e56ac2f90b-kube-api-access-cctcf\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.976996 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-log-ovn\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.977078 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-additional-scripts\") pod 
\"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.977121 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.977161 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run-ovn\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.977202 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-scripts\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.978058 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.978062 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-log-ovn\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: 
\"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.978170 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run-ovn\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.978655 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-additional-scripts\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.981643 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-scripts\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:13 crc kubenswrapper[4585]: I0215 17:21:13.997684 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cctcf\" (UniqueName: \"kubernetes.io/projected/9dd44da9-95a1-471a-99de-19e56ac2f90b-kube-api-access-cctcf\") pod \"ovn-controller-87pkc-config-mdqlf\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:14 crc kubenswrapper[4585]: I0215 17:21:14.104992 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:16 crc kubenswrapper[4585]: I0215 17:21:16.133652 4585 generic.go:334] "Generic (PLEG): container finished" podID="9de65e3c-3874-4fc0-9566-84138bb228b7" containerID="2dc9046518fb8df2747e6c9324c0f5f19fc85c8b9ac0bc03342da2ba06ce681d" exitCode=0 Feb 15 17:21:16 crc kubenswrapper[4585]: I0215 17:21:16.133724 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9de65e3c-3874-4fc0-9566-84138bb228b7","Type":"ContainerDied","Data":"2dc9046518fb8df2747e6c9324c0f5f19fc85c8b9ac0bc03342da2ba06ce681d"} Feb 15 17:21:17 crc kubenswrapper[4585]: I0215 17:21:17.015053 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:21:17 crc kubenswrapper[4585]: I0215 17:21:17.015138 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:21:17 crc kubenswrapper[4585]: I0215 17:21:17.015199 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:21:17 crc kubenswrapper[4585]: I0215 17:21:17.016345 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cc3960491aa7365ef9a992dbe57461170d7a99f094dd61fc5fce5575354ba90"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Feb 15 17:21:17 crc kubenswrapper[4585]: I0215 17:21:17.016512 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://0cc3960491aa7365ef9a992dbe57461170d7a99f094dd61fc5fce5575354ba90" gracePeriod=600 Feb 15 17:21:17 crc kubenswrapper[4585]: E0215 17:21:17.269885 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad is running failed: container process not found" containerID="ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad" cmd=["grpc_health_probe","-addr=:50051"] Feb 15 17:21:17 crc kubenswrapper[4585]: E0215 17:21:17.270472 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad is running failed: container process not found" containerID="ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad" cmd=["grpc_health_probe","-addr=:50051"] Feb 15 17:21:17 crc kubenswrapper[4585]: E0215 17:21:17.270795 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad is running failed: container process not found" containerID="ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad" cmd=["grpc_health_probe","-addr=:50051"] Feb 15 17:21:17 crc kubenswrapper[4585]: E0215 17:21:17.270838 4585 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-4z4g6" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="registry-server" Feb 15 17:21:17 crc kubenswrapper[4585]: I0215 17:21:17.887969 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:21:17 crc kubenswrapper[4585]: I0215 17:21:17.894391 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-f248v" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051455 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzwhl\" (UniqueName: \"kubernetes.io/projected/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-kube-api-access-xzwhl\") pod \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051662 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-swiftconf\") pod \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051721 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-ring-data-devices\") pod \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051748 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-etc-swift\") pod 
\"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051792 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1e0d19-52a5-4959-9e9f-74094993a95c-operator-scripts\") pod \"0b1e0d19-52a5-4959-9e9f-74094993a95c\" (UID: \"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051828 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-combined-ca-bundle\") pod \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051855 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2kxd\" (UniqueName: \"kubernetes.io/projected/0b1e0d19-52a5-4959-9e9f-74094993a95c-kube-api-access-z2kxd\") pod \"0b1e0d19-52a5-4959-9e9f-74094993a95c\" (UID: \"0b1e0d19-52a5-4959-9e9f-74094993a95c\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051895 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-scripts\") pod \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.051936 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-dispersionconf\") pod \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\" (UID: \"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.053902 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" (UID: "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.055875 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-kube-api-access-xzwhl" (OuterVolumeSpecName: "kube-api-access-xzwhl") pod "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" (UID: "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6"). InnerVolumeSpecName "kube-api-access-xzwhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.056377 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1e0d19-52a5-4959-9e9f-74094993a95c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b1e0d19-52a5-4959-9e9f-74094993a95c" (UID: "0b1e0d19-52a5-4959-9e9f-74094993a95c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.056385 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" (UID: "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.057222 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1e0d19-52a5-4959-9e9f-74094993a95c-kube-api-access-z2kxd" (OuterVolumeSpecName: "kube-api-access-z2kxd") pod "0b1e0d19-52a5-4959-9e9f-74094993a95c" (UID: "0b1e0d19-52a5-4959-9e9f-74094993a95c"). InnerVolumeSpecName "kube-api-access-z2kxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.075662 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" (UID: "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.102080 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" (UID: "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.104869 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-scripts" (OuterVolumeSpecName: "scripts") pod "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" (UID: "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.120921 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" (UID: "fd5d7c58-38f8-40cb-89c0-6f97f6063ca6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.153310 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6ls6x" event={"ID":"fd5d7c58-38f8-40cb-89c0-6f97f6063ca6","Type":"ContainerDied","Data":"86b40e8c76cc7a1eb56be6aea205b538c51af4c23d2d2451eeff9a7503c00ca2"} Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.153347 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86b40e8c76cc7a1eb56be6aea205b538c51af4c23d2d2451eeff9a7503c00ca2" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.153403 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6ls6x" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154014 4585 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154046 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzwhl\" (UniqueName: \"kubernetes.io/projected/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-kube-api-access-xzwhl\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154057 4585 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154068 4585 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154076 4585 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154084 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1e0d19-52a5-4959-9e9f-74094993a95c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154092 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 
17:21:18.154100 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2kxd\" (UniqueName: \"kubernetes.io/projected/0b1e0d19-52a5-4959-9e9f-74094993a95c-kube-api-access-z2kxd\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.154108 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd5d7c58-38f8-40cb-89c0-6f97f6063ca6-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.157504 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9de65e3c-3874-4fc0-9566-84138bb228b7","Type":"ContainerStarted","Data":"606ccdf97b6fe9efd4ac0257cc29eae4f4f20499e41c08fdaeee74187a483060"} Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.157798 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.160076 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-f248v" event={"ID":"0b1e0d19-52a5-4959-9e9f-74094993a95c","Type":"ContainerDied","Data":"3ad8129785b85c04dadd6df09455c9038647021e4cd0f4f882482990e7aaa346"} Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.160102 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad8129785b85c04dadd6df09455c9038647021e4cd0f4f882482990e7aaa346" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.160337 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-f248v" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.169606 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.171585 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="0cc3960491aa7365ef9a992dbe57461170d7a99f094dd61fc5fce5575354ba90" exitCode=0 Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.171638 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"0cc3960491aa7365ef9a992dbe57461170d7a99f094dd61fc5fce5575354ba90"} Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.171662 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"6f05a1d156be3c851f680780cf5a4d67dc38d38043f804ea0899d1efe3927d68"} Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.171676 4585 scope.go:117] "RemoveContainer" containerID="66bd3998ff2493b6c4431c56b818df1c025a69a1e07091de641e0ebe4853beee" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.183538 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.818451597 podStartE2EDuration="1m15.183517094s" podCreationTimestamp="2026-02-15 17:20:03 +0000 UTC" firstStartedPulling="2026-02-15 17:20:05.378029558 +0000 UTC m=+861.321437690" lastFinishedPulling="2026-02-15 17:20:42.743095055 +0000 UTC m=+898.686503187" observedRunningTime="2026-02-15 17:21:18.177166111 +0000 UTC m=+934.120574243" watchObservedRunningTime="2026-02-15 17:21:18.183517094 +0000 UTC m=+934.126925216" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.255257 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t48j4\" (UniqueName: 
\"kubernetes.io/projected/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-kube-api-access-t48j4\") pod \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.255521 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-catalog-content\") pod \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.255560 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-utilities\") pod \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\" (UID: \"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac\") " Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.256962 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-utilities" (OuterVolumeSpecName: "utilities") pod "d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" (UID: "d50173d4-2ad6-4d52-92cb-e357aaa1e8ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.260562 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-kube-api-access-t48j4" (OuterVolumeSpecName: "kube-api-access-t48j4") pod "d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" (UID: "d50173d4-2ad6-4d52-92cb-e357aaa1e8ac"). InnerVolumeSpecName "kube-api-access-t48j4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.326466 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwgww"] Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.339668 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" (UID: "d50173d4-2ad6-4d52-92cb-e357aaa1e8ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:21:18 crc kubenswrapper[4585]: W0215 17:21:18.346054 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e9a733a_04ee_4eff_bf0a_91f112778d91.slice/crio-db5e8f7d3e4b31b42a0fbe1dca9cb284675fedd07f6af8e11d2d9eb6659873e6 WatchSource:0}: Error finding container db5e8f7d3e4b31b42a0fbe1dca9cb284675fedd07f6af8e11d2d9eb6659873e6: Status 404 returned error can't find the container with id db5e8f7d3e4b31b42a0fbe1dca9cb284675fedd07f6af8e11d2d9eb6659873e6 Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.358036 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t48j4\" (UniqueName: \"kubernetes.io/projected/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-kube-api-access-t48j4\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.358062 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.358071 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac-utilities\") on node 
\"crc\" DevicePath \"\"" Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.377409 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-87pkc-config-mdqlf"] Feb 15 17:21:18 crc kubenswrapper[4585]: W0215 17:21:18.396232 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dd44da9_95a1_471a_99de_19e56ac2f90b.slice/crio-06a93b021aaacacbd909fdb155f1d73472011d959be9f2605ec4732da2f3ac3a WatchSource:0}: Error finding container 06a93b021aaacacbd909fdb155f1d73472011d959be9f2605ec4732da2f3ac3a: Status 404 returned error can't find the container with id 06a93b021aaacacbd909fdb155f1d73472011d959be9f2605ec4732da2f3ac3a Feb 15 17:21:18 crc kubenswrapper[4585]: I0215 17:21:18.542489 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-87pkc" podUID="2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9" containerName="ovn-controller" probeResult="failure" output=< Feb 15 17:21:18 crc kubenswrapper[4585]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 15 17:21:18 crc kubenswrapper[4585]: > Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.183034 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerID="8038e0cf928e36207cf5c0ab2f488c3b6bd97e523c11afc4d46049a8e3584012" exitCode=0 Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.183219 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwgww" event={"ID":"4e9a733a-04ee-4eff-bf0a-91f112778d91","Type":"ContainerDied","Data":"8038e0cf928e36207cf5c0ab2f488c3b6bd97e523c11afc4d46049a8e3584012"} Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.183517 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwgww" 
event={"ID":"4e9a733a-04ee-4eff-bf0a-91f112778d91","Type":"ContainerStarted","Data":"db5e8f7d3e4b31b42a0fbe1dca9cb284675fedd07f6af8e11d2d9eb6659873e6"} Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.207158 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4z4g6" event={"ID":"d50173d4-2ad6-4d52-92cb-e357aaa1e8ac","Type":"ContainerDied","Data":"4bd7bb775bad204a7a929d08e1ad1848998e83c914b077f74b20f4a34cd499f0"} Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.207204 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4z4g6" Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.207209 4585 scope.go:117] "RemoveContainer" containerID="ca1100724c45b3553cc3b7d65457f6ca81132e7a03ded7da534282c67f106bad" Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.216119 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7nlr5" event={"ID":"d959ed97-3c6f-4503-864e-57104658b927","Type":"ContainerStarted","Data":"c6e2eeb157b98e31a26fb4f504ef0d56b2f0e8954627bf92596f438ac46a0495"} Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.218855 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-87pkc-config-mdqlf" event={"ID":"9dd44da9-95a1-471a-99de-19e56ac2f90b","Type":"ContainerStarted","Data":"d8e5e4394e1c825f0599f2cf17af2ea8fe55d5ff071f368d7c7fcc9f120dd3e6"} Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.218875 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-87pkc-config-mdqlf" event={"ID":"9dd44da9-95a1-471a-99de-19e56ac2f90b","Type":"ContainerStarted","Data":"06a93b021aaacacbd909fdb155f1d73472011d959be9f2605ec4732da2f3ac3a"} Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.237758 4585 scope.go:117] "RemoveContainer" containerID="515f133c401184a1132b0d4f79d67a63d223e52167033f30663a663999435c73" Feb 15 17:21:19 crc 
kubenswrapper[4585]: I0215 17:21:19.322276 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4z4g6"] Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.325770 4585 scope.go:117] "RemoveContainer" containerID="8aa4fe64ca62dadc07d5818e79a7db0e6fb71fb3a84d485f2c7df6711af125df" Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.329475 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4z4g6"] Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.361670 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-87pkc-config-mdqlf" podStartSLOduration=6.3616548 podStartE2EDuration="6.3616548s" podCreationTimestamp="2026-02-15 17:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:19.354889376 +0000 UTC m=+935.298297508" watchObservedRunningTime="2026-02-15 17:21:19.3616548 +0000 UTC m=+935.305062922" Feb 15 17:21:19 crc kubenswrapper[4585]: I0215 17:21:19.454393 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7nlr5" podStartSLOduration=5.292649885 podStartE2EDuration="21.454378629s" podCreationTimestamp="2026-02-15 17:20:58 +0000 UTC" firstStartedPulling="2026-02-15 17:21:01.69533605 +0000 UTC m=+917.638744182" lastFinishedPulling="2026-02-15 17:21:17.857064794 +0000 UTC m=+933.800472926" observedRunningTime="2026-02-15 17:21:19.441051745 +0000 UTC m=+935.384459877" watchObservedRunningTime="2026-02-15 17:21:19.454378629 +0000 UTC m=+935.397786761" Feb 15 17:21:20 crc kubenswrapper[4585]: I0215 17:21:20.227988 4585 generic.go:334] "Generic (PLEG): container finished" podID="9dd44da9-95a1-471a-99de-19e56ac2f90b" containerID="d8e5e4394e1c825f0599f2cf17af2ea8fe55d5ff071f368d7c7fcc9f120dd3e6" exitCode=0 Feb 15 17:21:20 crc kubenswrapper[4585]: I0215 
17:21:20.228108 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-87pkc-config-mdqlf" event={"ID":"9dd44da9-95a1-471a-99de-19e56ac2f90b","Type":"ContainerDied","Data":"d8e5e4394e1c825f0599f2cf17af2ea8fe55d5ff071f368d7c7fcc9f120dd3e6"} Feb 15 17:21:20 crc kubenswrapper[4585]: I0215 17:21:20.230457 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwgww" event={"ID":"4e9a733a-04ee-4eff-bf0a-91f112778d91","Type":"ContainerStarted","Data":"a54af59911e03d5335a4ecf6cc48dfe7766178a3c85e9656fd5566ce551bb80c"} Feb 15 17:21:20 crc kubenswrapper[4585]: I0215 17:21:20.854436 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" path="/var/lib/kubelet/pods/d50173d4-2ad6-4d52-92cb-e357aaa1e8ac/volumes" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.608440 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.734296 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run\") pod \"9dd44da9-95a1-471a-99de-19e56ac2f90b\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.734379 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-scripts\") pod \"9dd44da9-95a1-471a-99de-19e56ac2f90b\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.734410 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cctcf\" (UniqueName: \"kubernetes.io/projected/9dd44da9-95a1-471a-99de-19e56ac2f90b-kube-api-access-cctcf\") pod 
\"9dd44da9-95a1-471a-99de-19e56ac2f90b\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.734449 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-log-ovn\") pod \"9dd44da9-95a1-471a-99de-19e56ac2f90b\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.734481 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run-ovn\") pod \"9dd44da9-95a1-471a-99de-19e56ac2f90b\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.734582 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-additional-scripts\") pod \"9dd44da9-95a1-471a-99de-19e56ac2f90b\" (UID: \"9dd44da9-95a1-471a-99de-19e56ac2f90b\") " Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.735530 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run" (OuterVolumeSpecName: "var-run") pod "9dd44da9-95a1-471a-99de-19e56ac2f90b" (UID: "9dd44da9-95a1-471a-99de-19e56ac2f90b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.735616 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9dd44da9-95a1-471a-99de-19e56ac2f90b" (UID: "9dd44da9-95a1-471a-99de-19e56ac2f90b"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.735622 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9dd44da9-95a1-471a-99de-19e56ac2f90b" (UID: "9dd44da9-95a1-471a-99de-19e56ac2f90b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.735671 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9dd44da9-95a1-471a-99de-19e56ac2f90b" (UID: "9dd44da9-95a1-471a-99de-19e56ac2f90b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.736910 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-scripts" (OuterVolumeSpecName: "scripts") pod "9dd44da9-95a1-471a-99de-19e56ac2f90b" (UID: "9dd44da9-95a1-471a-99de-19e56ac2f90b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.762252 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd44da9-95a1-471a-99de-19e56ac2f90b-kube-api-access-cctcf" (OuterVolumeSpecName: "kube-api-access-cctcf") pod "9dd44da9-95a1-471a-99de-19e56ac2f90b" (UID: "9dd44da9-95a1-471a-99de-19e56ac2f90b"). InnerVolumeSpecName "kube-api-access-cctcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.836985 4585 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.837025 4585 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.837038 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dd44da9-95a1-471a-99de-19e56ac2f90b-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.837050 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cctcf\" (UniqueName: \"kubernetes.io/projected/9dd44da9-95a1-471a-99de-19e56ac2f90b-kube-api-access-cctcf\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.837064 4585 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:21 crc kubenswrapper[4585]: I0215 17:21:21.837077 4585 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9dd44da9-95a1-471a-99de-19e56ac2f90b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:22 crc kubenswrapper[4585]: I0215 17:21:22.253957 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-87pkc-config-mdqlf" event={"ID":"9dd44da9-95a1-471a-99de-19e56ac2f90b","Type":"ContainerDied","Data":"06a93b021aaacacbd909fdb155f1d73472011d959be9f2605ec4732da2f3ac3a"} Feb 15 17:21:22 crc 
kubenswrapper[4585]: I0215 17:21:22.254267 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a93b021aaacacbd909fdb155f1d73472011d959be9f2605ec4732da2f3ac3a" Feb 15 17:21:22 crc kubenswrapper[4585]: I0215 17:21:22.254030 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-87pkc-config-mdqlf" Feb 15 17:21:22 crc kubenswrapper[4585]: I0215 17:21:22.829226 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-87pkc-config-mdqlf"] Feb 15 17:21:22 crc kubenswrapper[4585]: I0215 17:21:22.838887 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-87pkc-config-mdqlf"] Feb 15 17:21:22 crc kubenswrapper[4585]: I0215 17:21:22.852938 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd44da9-95a1-471a-99de-19e56ac2f90b" path="/var/lib/kubelet/pods/9dd44da9-95a1-471a-99de-19e56ac2f90b/volumes" Feb 15 17:21:23 crc kubenswrapper[4585]: I0215 17:21:23.264456 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:21:23 crc kubenswrapper[4585]: I0215 17:21:23.271871 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92c8f627-225d-40ac-827b-d2f3476c1768-etc-swift\") pod \"swift-storage-0\" (UID: \"92c8f627-225d-40ac-827b-d2f3476c1768\") " pod="openstack/swift-storage-0" Feb 15 17:21:23 crc kubenswrapper[4585]: I0215 17:21:23.473171 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0"
Feb 15 17:21:23 crc kubenswrapper[4585]: I0215 17:21:23.504180 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-87pkc"
Feb 15 17:21:23 crc kubenswrapper[4585]: I0215 17:21:23.992098 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 15 17:21:24 crc kubenswrapper[4585]: I0215 17:21:24.276639 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"3b2712639fb122990ac6a47ccd04c83c5b3d9019a43604f9a8f336b3706702c5"}
Feb 15 17:21:25 crc kubenswrapper[4585]: I0215 17:21:25.289874 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"f388b280ef166ed917420bea94cc91e5a7b69a21d2f79b7d9a8479d1bfcb9f03"}
Feb 15 17:21:26 crc kubenswrapper[4585]: I0215 17:21:26.303552 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerID="a54af59911e03d5335a4ecf6cc48dfe7766178a3c85e9656fd5566ce551bb80c" exitCode=0
Feb 15 17:21:26 crc kubenswrapper[4585]: I0215 17:21:26.303675 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwgww" event={"ID":"4e9a733a-04ee-4eff-bf0a-91f112778d91","Type":"ContainerDied","Data":"a54af59911e03d5335a4ecf6cc48dfe7766178a3c85e9656fd5566ce551bb80c"}
Feb 15 17:21:26 crc kubenswrapper[4585]: I0215 17:21:26.308992 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"1b1d2acdde6351d100bc0a075c3760b67612e5f56fb9d67741711b8be933f272"}
Feb 15 17:21:26 crc kubenswrapper[4585]: I0215 17:21:26.309022 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"dfb414d0d72f6b7b6a5994b632fbe008cc4e6f66acf941d382f33ad1f8cc7753"}
Feb 15 17:21:27 crc kubenswrapper[4585]: I0215 17:21:27.331415 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwgww" event={"ID":"4e9a733a-04ee-4eff-bf0a-91f112778d91","Type":"ContainerStarted","Data":"249ff933754966b2c1225475204a7f5a4f7b6c40fac64bf98b56e5c20e7b128d"}
Feb 15 17:21:27 crc kubenswrapper[4585]: I0215 17:21:27.337842 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"4efe7f55127fade586823a1fcdd47b916a2e88a663292a1cd08d37fc9173d375"}
Feb 15 17:21:27 crc kubenswrapper[4585]: I0215 17:21:27.365883 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwgww" podStartSLOduration=8.844961446 podStartE2EDuration="16.365866133s" podCreationTimestamp="2026-02-15 17:21:11 +0000 UTC" firstStartedPulling="2026-02-15 17:21:19.184867271 +0000 UTC m=+935.128275403" lastFinishedPulling="2026-02-15 17:21:26.705771928 +0000 UTC m=+942.649180090" observedRunningTime="2026-02-15 17:21:27.360436045 +0000 UTC m=+943.303844187" watchObservedRunningTime="2026-02-15 17:21:27.365866133 +0000 UTC m=+943.309274265"
Feb 15 17:21:28 crc kubenswrapper[4585]: I0215 17:21:28.361985 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"56d120cc26dbe783df80bc227bbc3f8bca17177d0a0c34363116de8eb1dc7d84"}
Feb 15 17:21:28 crc kubenswrapper[4585]: I0215 17:21:28.362298 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"028e15625ce31ca76c6f112165319dc36379ba80580f05ded3138809c22c12a6"}
Feb 15 17:21:28 crc kubenswrapper[4585]: I0215 17:21:28.362311 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"cefbe24a354ce93eec3513ba0765d61f08ec3ac34cab908e300f15c24125ca96"}
Feb 15 17:21:29 crc kubenswrapper[4585]: I0215 17:21:29.383507 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"f865f777725f57778e4cae57a67fb30bf26317a214e2f58e625f487d353dbfce"}
Feb 15 17:21:30 crc kubenswrapper[4585]: I0215 17:21:30.395862 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"a3b3ca560c4bf35db8ac104e002e367847b24ad9f86d94a76c567f6bdf55f590"}
Feb 15 17:21:30 crc kubenswrapper[4585]: I0215 17:21:30.396120 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"e06a93b163cea08f4634c75a7f6417b48ed1dc322800864cf885451caba829ea"}
Feb 15 17:21:30 crc kubenswrapper[4585]: I0215 17:21:30.396131 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"c861f2a6ad1637b65733482d62d66057226d8573272a600c632c10c53154c823"}
Feb 15 17:21:30 crc kubenswrapper[4585]: I0215 17:21:30.397475 4585 generic.go:334] "Generic (PLEG): container finished" podID="d959ed97-3c6f-4503-864e-57104658b927" containerID="c6e2eeb157b98e31a26fb4f504ef0d56b2f0e8954627bf92596f438ac46a0495" exitCode=0
Feb 15 17:21:30 crc kubenswrapper[4585]: I0215 17:21:30.397500 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7nlr5" event={"ID":"d959ed97-3c6f-4503-864e-57104658b927","Type":"ContainerDied","Data":"c6e2eeb157b98e31a26fb4f504ef0d56b2f0e8954627bf92596f438ac46a0495"}
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.416494 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"583272491d3b4fe346065f02d841c82518d847d46862467c48a5a45c07966bbd"}
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.416811 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"97eddd84f4c49068e63b831c48314a49be46112f1fe7a5e20703e548b480cc3e"}
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.416822 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"1f5274f74c80398f2e56de4e9b6eb6b1fc40a04a6bf0c2b1dc06160190c83f2b"}
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.416830 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92c8f627-225d-40ac-827b-d2f3476c1768","Type":"ContainerStarted","Data":"6a094703a7071d9dccdc64221bf2f2d849aa65a6b5fd5835f0c22f9fc064c1d2"}
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.447051 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.899590211 podStartE2EDuration="41.447036678s" podCreationTimestamp="2026-02-15 17:20:50 +0000 UTC" firstStartedPulling="2026-02-15 17:21:24.02067117 +0000 UTC m=+939.964079312" lastFinishedPulling="2026-02-15 17:21:29.568117647 +0000 UTC m=+945.511525779" observedRunningTime="2026-02-15 17:21:31.444445988 +0000 UTC m=+947.387854120" watchObservedRunningTime="2026-02-15 17:21:31.447036678 +0000 UTC m=+947.390444810"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.834924 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-krgtj"]
Feb 15 17:21:31 crc kubenswrapper[4585]: E0215 17:21:31.835734 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="registry-server"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.835756 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="registry-server"
Feb 15 17:21:31 crc kubenswrapper[4585]: E0215 17:21:31.835777 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="extract-utilities"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.835786 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="extract-utilities"
Feb 15 17:21:31 crc kubenswrapper[4585]: E0215 17:21:31.835825 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1e0d19-52a5-4959-9e9f-74094993a95c" containerName="mariadb-account-create-update"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.835835 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1e0d19-52a5-4959-9e9f-74094993a95c" containerName="mariadb-account-create-update"
Feb 15 17:21:31 crc kubenswrapper[4585]: E0215 17:21:31.835852 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="extract-content"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.835860 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="extract-content"
Feb 15 17:21:31 crc kubenswrapper[4585]: E0215 17:21:31.835879 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd44da9-95a1-471a-99de-19e56ac2f90b" containerName="ovn-config"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.835887 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd44da9-95a1-471a-99de-19e56ac2f90b" containerName="ovn-config"
Feb 15 17:21:31 crc kubenswrapper[4585]: E0215 17:21:31.835910 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" containerName="swift-ring-rebalance"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.835920 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" containerName="swift-ring-rebalance"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.836164 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50173d4-2ad6-4d52-92cb-e357aaa1e8ac" containerName="registry-server"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.836194 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd44da9-95a1-471a-99de-19e56ac2f90b" containerName="ovn-config"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.836221 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5d7c58-38f8-40cb-89c0-6f97f6063ca6" containerName="swift-ring-rebalance"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.836233 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1e0d19-52a5-4959-9e9f-74094993a95c" containerName="mariadb-account-create-update"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.837360 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.846697 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-krgtj"]
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.847804 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.875104 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7nlr5"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930135 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-db-sync-config-data\") pod \"d959ed97-3c6f-4503-864e-57104658b927\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") "
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930246 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp2rv\" (UniqueName: \"kubernetes.io/projected/d959ed97-3c6f-4503-864e-57104658b927-kube-api-access-dp2rv\") pod \"d959ed97-3c6f-4503-864e-57104658b927\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") "
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930309 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-combined-ca-bundle\") pod \"d959ed97-3c6f-4503-864e-57104658b927\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") "
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930336 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-config-data\") pod \"d959ed97-3c6f-4503-864e-57104658b927\" (UID: \"d959ed97-3c6f-4503-864e-57104658b927\") "
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930643 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-config\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930667 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930727 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqj8l\" (UniqueName: \"kubernetes.io/projected/10049121-fd50-4354-bb9f-9ef573352e55-kube-api-access-hqj8l\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930754 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930785 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.930907 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.954756 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d959ed97-3c6f-4503-864e-57104658b927" (UID: "d959ed97-3c6f-4503-864e-57104658b927"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:21:31 crc kubenswrapper[4585]: I0215 17:21:31.973847 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d959ed97-3c6f-4503-864e-57104658b927-kube-api-access-dp2rv" (OuterVolumeSpecName: "kube-api-access-dp2rv") pod "d959ed97-3c6f-4503-864e-57104658b927" (UID: "d959ed97-3c6f-4503-864e-57104658b927"). InnerVolumeSpecName "kube-api-access-dp2rv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.017287 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d959ed97-3c6f-4503-864e-57104658b927" (UID: "d959ed97-3c6f-4503-864e-57104658b927"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.022137 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-config-data" (OuterVolumeSpecName: "config-data") pod "d959ed97-3c6f-4503-864e-57104658b927" (UID: "d959ed97-3c6f-4503-864e-57104658b927"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032368 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-config\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032418 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032476 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8l\" (UniqueName: \"kubernetes.io/projected/10049121-fd50-4354-bb9f-9ef573352e55-kube-api-access-hqj8l\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032500 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032524 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032610 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032675 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp2rv\" (UniqueName: \"kubernetes.io/projected/d959ed97-3c6f-4503-864e-57104658b927-kube-api-access-dp2rv\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032688 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032698 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-config-data\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.032707 4585 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d959ed97-3c6f-4503-864e-57104658b927-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.033283 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.033285 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.033848 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-config\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.034050 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.034308 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.050024 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqj8l\" (UniqueName: \"kubernetes.io/projected/10049121-fd50-4354-bb9f-9ef573352e55-kube-api-access-hqj8l\") pod \"dnsmasq-dns-5c79d794d7-krgtj\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") " pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.085849 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwgww"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.086137 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kwgww"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.191639 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.429838 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7nlr5" event={"ID":"d959ed97-3c6f-4503-864e-57104658b927","Type":"ContainerDied","Data":"6a32a2a9550c52c12ae604365c7891563f9a2a57e1d750924e8f01196c4f0c90"}
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.430101 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a32a2a9550c52c12ae604365c7891563f9a2a57e1d750924e8f01196c4f0c90"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.430235 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7nlr5"
Feb 15 17:21:32 crc kubenswrapper[4585]: I0215 17:21:32.763214 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-krgtj"]
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.022078 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-krgtj"]
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.082641 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-jz6b2"]
Feb 15 17:21:33 crc kubenswrapper[4585]: E0215 17:21:33.083067 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d959ed97-3c6f-4503-864e-57104658b927" containerName="glance-db-sync"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.083083 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d959ed97-3c6f-4503-864e-57104658b927" containerName="glance-db-sync"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.083312 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d959ed97-3c6f-4503-864e-57104658b927" containerName="glance-db-sync"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.084274 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.107231 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-jz6b2"]
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.146766 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kwgww" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="registry-server" probeResult="failure" output=<
Feb 15 17:21:33 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s
Feb 15 17:21:33 crc kubenswrapper[4585]: >
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.276141 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.276195 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4db\" (UniqueName: \"kubernetes.io/projected/aee7bb0b-d27a-45c0-b514-a29314db0609-kube-api-access-8c4db\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.276216 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.276238 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.276278 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.276302 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-config\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.377403 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.377446 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4db\" (UniqueName: \"kubernetes.io/projected/aee7bb0b-d27a-45c0-b514-a29314db0609-kube-api-access-8c4db\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.377467 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.377488 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.377524 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.377545 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-config\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.378335 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.378490 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.379357 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-config\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.379387 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.379442 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.397290 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c4db\" (UniqueName: \"kubernetes.io/projected/aee7bb0b-d27a-45c0-b514-a29314db0609-kube-api-access-8c4db\") pod \"dnsmasq-dns-5f59b8f679-jz6b2\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.418437 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.439499 4585 generic.go:334] "Generic (PLEG): container finished" podID="10049121-fd50-4354-bb9f-9ef573352e55" containerID="6956a6f11dfc0a632de1ebc71cef8f3b09fb53f0df0c60f652f4726542cd846f" exitCode=0
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.439542 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-krgtj" event={"ID":"10049121-fd50-4354-bb9f-9ef573352e55","Type":"ContainerDied","Data":"6956a6f11dfc0a632de1ebc71cef8f3b09fb53f0df0c60f652f4726542cd846f"}
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.439570 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-krgtj" event={"ID":"10049121-fd50-4354-bb9f-9ef573352e55","Type":"ContainerStarted","Data":"8593c13b75b1d5fbb25d101b3f8217aad23ef76eb902f8ea1c38aec7ad2a5215"}
Feb 15 17:21:33 crc kubenswrapper[4585]: I0215 17:21:33.942868 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-krgtj"
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.086884 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-jz6b2"]
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.096562 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-svc\") pod \"10049121-fd50-4354-bb9f-9ef573352e55\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") "
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.096698 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-swift-storage-0\") pod \"10049121-fd50-4354-bb9f-9ef573352e55\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") "
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.096739 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-sb\") pod \"10049121-fd50-4354-bb9f-9ef573352e55\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") "
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.096770 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-nb\") pod \"10049121-fd50-4354-bb9f-9ef573352e55\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") "
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.096884 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqj8l\" (UniqueName: \"kubernetes.io/projected/10049121-fd50-4354-bb9f-9ef573352e55-kube-api-access-hqj8l\") pod \"10049121-fd50-4354-bb9f-9ef573352e55\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") "
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.096913 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-config\") pod \"10049121-fd50-4354-bb9f-9ef573352e55\" (UID: \"10049121-fd50-4354-bb9f-9ef573352e55\") "
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.125879 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10049121-fd50-4354-bb9f-9ef573352e55-kube-api-access-hqj8l" (OuterVolumeSpecName: "kube-api-access-hqj8l") pod "10049121-fd50-4354-bb9f-9ef573352e55" (UID: "10049121-fd50-4354-bb9f-9ef573352e55"). InnerVolumeSpecName "kube-api-access-hqj8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.128138 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10049121-fd50-4354-bb9f-9ef573352e55" (UID: "10049121-fd50-4354-bb9f-9ef573352e55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.128279 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10049121-fd50-4354-bb9f-9ef573352e55" (UID: "10049121-fd50-4354-bb9f-9ef573352e55"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.148636 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-config" (OuterVolumeSpecName: "config") pod "10049121-fd50-4354-bb9f-9ef573352e55" (UID: "10049121-fd50-4354-bb9f-9ef573352e55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.151527 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10049121-fd50-4354-bb9f-9ef573352e55" (UID: "10049121-fd50-4354-bb9f-9ef573352e55"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.153885 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10049121-fd50-4354-bb9f-9ef573352e55" (UID: "10049121-fd50-4354-bb9f-9ef573352e55"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.199341 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqj8l\" (UniqueName: \"kubernetes.io/projected/10049121-fd50-4354-bb9f-9ef573352e55-kube-api-access-hqj8l\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.199387 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.199396 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.199404 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.199416 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.199424 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10049121-fd50-4354-bb9f-9ef573352e55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.453423 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-krgtj" event={"ID":"10049121-fd50-4354-bb9f-9ef573352e55","Type":"ContainerDied","Data":"8593c13b75b1d5fbb25d101b3f8217aad23ef76eb902f8ea1c38aec7ad2a5215"} Feb 15 17:21:34 crc 
kubenswrapper[4585]: I0215 17:21:34.453435 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-krgtj" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.453482 4585 scope.go:117] "RemoveContainer" containerID="6956a6f11dfc0a632de1ebc71cef8f3b09fb53f0df0c60f652f4726542cd846f" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.462503 4585 generic.go:334] "Generic (PLEG): container finished" podID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerID="17c206d14c9c71cd653f5060958aad40251cddde0bd091ded7f4f4ff8c3fc7fe" exitCode=0 Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.462556 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" event={"ID":"aee7bb0b-d27a-45c0-b514-a29314db0609","Type":"ContainerDied","Data":"17c206d14c9c71cd653f5060958aad40251cddde0bd091ded7f4f4ff8c3fc7fe"} Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.462588 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" event={"ID":"aee7bb0b-d27a-45c0-b514-a29314db0609","Type":"ContainerStarted","Data":"e4d8ce6630e05eb4103ab41921e425ef6c837ee2e9deff91eece58f4f473331d"} Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.524653 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-krgtj"] Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.532238 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-krgtj"] Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.851420 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10049121-fd50-4354-bb9f-9ef573352e55" path="/var/lib/kubelet/pods/10049121-fd50-4354-bb9f-9ef573352e55/volumes" Feb 15 17:21:34 crc kubenswrapper[4585]: I0215 17:21:34.899756 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 15 17:21:35 
crc kubenswrapper[4585]: I0215 17:21:35.302509 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cg64k"] Feb 15 17:21:35 crc kubenswrapper[4585]: E0215 17:21:35.303097 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10049121-fd50-4354-bb9f-9ef573352e55" containerName="init" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.303116 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="10049121-fd50-4354-bb9f-9ef573352e55" containerName="init" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.303310 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="10049121-fd50-4354-bb9f-9ef573352e55" containerName="init" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.303841 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.322243 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cg64k"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.420338 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-nkbd8"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.424039 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.431717 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2724d82d-6eac-48a5-8ea8-4008702ab558-operator-scripts\") pod \"cinder-db-create-cg64k\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.431814 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msfgq\" (UniqueName: \"kubernetes.io/projected/2724d82d-6eac-48a5-8ea8-4008702ab558-kube-api-access-msfgq\") pod \"cinder-db-create-cg64k\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.440970 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nkbd8"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.527523 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" event={"ID":"aee7bb0b-d27a-45c0-b514-a29314db0609","Type":"ContainerStarted","Data":"91d06bbff1d6ef96ab116258859bb62b9d5af3d2dc475f9da4beb0f557e51a91"} Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.527559 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.534154 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2724d82d-6eac-48a5-8ea8-4008702ab558-operator-scripts\") pod \"cinder-db-create-cg64k\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.534196 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppv29\" (UniqueName: \"kubernetes.io/projected/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-kube-api-access-ppv29\") pod \"barbican-db-create-nkbd8\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.534257 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-operator-scripts\") pod \"barbican-db-create-nkbd8\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.534292 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msfgq\" (UniqueName: \"kubernetes.io/projected/2724d82d-6eac-48a5-8ea8-4008702ab558-kube-api-access-msfgq\") pod \"cinder-db-create-cg64k\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.535072 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2724d82d-6eac-48a5-8ea8-4008702ab558-operator-scripts\") pod \"cinder-db-create-cg64k\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.539294 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1cf8-account-create-update-d6lrz"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.548333 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.550547 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1cf8-account-create-update-d6lrz"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.551893 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.552065 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" podStartSLOduration=2.5520522249999997 podStartE2EDuration="2.552052225s" podCreationTimestamp="2026-02-15 17:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:35.545091494 +0000 UTC m=+951.488499616" watchObservedRunningTime="2026-02-15 17:21:35.552052225 +0000 UTC m=+951.495460357" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.581921 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msfgq\" (UniqueName: \"kubernetes.io/projected/2724d82d-6eac-48a5-8ea8-4008702ab558-kube-api-access-msfgq\") pod \"cinder-db-create-cg64k\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.618170 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.635693 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppv29\" (UniqueName: \"kubernetes.io/projected/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-kube-api-access-ppv29\") pod \"barbican-db-create-nkbd8\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.635827 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-operator-scripts\") pod \"barbican-db-create-nkbd8\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.636935 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-operator-scripts\") pod \"barbican-db-create-nkbd8\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.695276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppv29\" (UniqueName: \"kubernetes.io/projected/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-kube-api-access-ppv29\") pod \"barbican-db-create-nkbd8\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.699912 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e401-account-create-update-4qk4n"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.710587 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.714101 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.731836 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e401-account-create-update-4qk4n"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.741223 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zv6\" (UniqueName: \"kubernetes.io/projected/38b7dbb3-1a0e-47a6-b248-039a22229706-kube-api-access-t2zv6\") pod \"barbican-1cf8-account-create-update-d6lrz\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.741306 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38b7dbb3-1a0e-47a6-b248-039a22229706-operator-scripts\") pod \"barbican-1cf8-account-create-update-d6lrz\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.758983 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.843153 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zv6\" (UniqueName: \"kubernetes.io/projected/38b7dbb3-1a0e-47a6-b248-039a22229706-kube-api-access-t2zv6\") pod \"barbican-1cf8-account-create-update-d6lrz\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.843472 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-operator-scripts\") pod \"cinder-e401-account-create-update-4qk4n\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.843518 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38b7dbb3-1a0e-47a6-b248-039a22229706-operator-scripts\") pod \"barbican-1cf8-account-create-update-d6lrz\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.843573 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmlh9\" (UniqueName: \"kubernetes.io/projected/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-kube-api-access-rmlh9\") pod \"cinder-e401-account-create-update-4qk4n\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.844928 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/38b7dbb3-1a0e-47a6-b248-039a22229706-operator-scripts\") pod \"barbican-1cf8-account-create-update-d6lrz\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.855636 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xdr6c"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.856791 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.859411 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xdr6c"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.871122 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zv6\" (UniqueName: \"kubernetes.io/projected/38b7dbb3-1a0e-47a6-b248-039a22229706-kube-api-access-t2zv6\") pod \"barbican-1cf8-account-create-update-d6lrz\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.944245 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmlh9\" (UniqueName: \"kubernetes.io/projected/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-kube-api-access-rmlh9\") pod \"cinder-e401-account-create-update-4qk4n\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.944299 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-operator-scripts\") pod \"neutron-db-create-xdr6c\" (UID: \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:35 
crc kubenswrapper[4585]: I0215 17:21:35.944366 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-operator-scripts\") pod \"cinder-e401-account-create-update-4qk4n\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.944409 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm2wr\" (UniqueName: \"kubernetes.io/projected/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-kube-api-access-sm2wr\") pod \"neutron-db-create-xdr6c\" (UID: \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.945237 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-operator-scripts\") pod \"cinder-e401-account-create-update-4qk4n\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.974022 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-da70-account-create-update-gvms7"] Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.979533 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.990548 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 15 17:21:35 crc kubenswrapper[4585]: I0215 17:21:35.998437 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmlh9\" (UniqueName: \"kubernetes.io/projected/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-kube-api-access-rmlh9\") pod \"cinder-e401-account-create-update-4qk4n\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.015236 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da70-account-create-update-gvms7"] Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.044779 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cg995"] Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.045863 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.049131 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm2wr\" (UniqueName: \"kubernetes.io/projected/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-kube-api-access-sm2wr\") pod \"neutron-db-create-xdr6c\" (UID: \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.049186 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792430d2-7a82-48c8-b091-fa2875ded5e8-operator-scripts\") pod \"neutron-da70-account-create-update-gvms7\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.049260 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-config-data\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.049285 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8z8j\" (UniqueName: \"kubernetes.io/projected/5f71e9d8-c516-41bf-89de-ddd7d51519f6-kube-api-access-n8z8j\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.049315 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-operator-scripts\") pod \"neutron-db-create-xdr6c\" (UID: 
\"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.049357 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs7b9\" (UniqueName: \"kubernetes.io/projected/792430d2-7a82-48c8-b091-fa2875ded5e8-kube-api-access-bs7b9\") pod \"neutron-da70-account-create-update-gvms7\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.049421 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-combined-ca-bundle\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.050387 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6w8mf" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.050527 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.051316 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-operator-scripts\") pod \"neutron-db-create-xdr6c\" (UID: \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.057757 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.073975 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 15 17:21:36 crc 
kubenswrapper[4585]: I0215 17:21:36.101565 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.107169 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm2wr\" (UniqueName: \"kubernetes.io/projected/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-kube-api-access-sm2wr\") pod \"neutron-db-create-xdr6c\" (UID: \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.131699 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cg995"] Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.151199 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792430d2-7a82-48c8-b091-fa2875ded5e8-operator-scripts\") pod \"neutron-da70-account-create-update-gvms7\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.151524 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-config-data\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.151547 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8z8j\" (UniqueName: \"kubernetes.io/projected/5f71e9d8-c516-41bf-89de-ddd7d51519f6-kube-api-access-n8z8j\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.151585 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bs7b9\" (UniqueName: \"kubernetes.io/projected/792430d2-7a82-48c8-b091-fa2875ded5e8-kube-api-access-bs7b9\") pod \"neutron-da70-account-create-update-gvms7\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.151642 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-combined-ca-bundle\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.154332 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792430d2-7a82-48c8-b091-fa2875ded5e8-operator-scripts\") pod \"neutron-da70-account-create-update-gvms7\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.167661 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-combined-ca-bundle\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.182308 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-config-data\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.182734 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.198276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs7b9\" (UniqueName: \"kubernetes.io/projected/792430d2-7a82-48c8-b091-fa2875ded5e8-kube-api-access-bs7b9\") pod \"neutron-da70-account-create-update-gvms7\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.203488 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8z8j\" (UniqueName: \"kubernetes.io/projected/5f71e9d8-c516-41bf-89de-ddd7d51519f6-kube-api-access-n8z8j\") pod \"keystone-db-sync-cg995\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") " pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.225538 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.315547 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.427588 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cg995" Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.697392 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cg64k"] Feb 15 17:21:36 crc kubenswrapper[4585]: I0215 17:21:36.737501 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-nkbd8"] Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.096519 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e401-account-create-update-4qk4n"] Feb 15 17:21:37 crc kubenswrapper[4585]: W0215 17:21:37.121389 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a35fd8_ae4c_4c30_a41b_0b1413b82ef4.slice/crio-e726c566dcf1f8e65de39297ba47beb3993857cbff7d4e408f4fff19a27cf3db WatchSource:0}: Error finding container e726c566dcf1f8e65de39297ba47beb3993857cbff7d4e408f4fff19a27cf3db: Status 404 returned error can't find the container with id e726c566dcf1f8e65de39297ba47beb3993857cbff7d4e408f4fff19a27cf3db Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.439145 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1cf8-account-create-update-d6lrz"] Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.469555 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xdr6c"] Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.564622 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-da70-account-create-update-gvms7"] Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.594783 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da70-account-create-update-gvms7" event={"ID":"792430d2-7a82-48c8-b091-fa2875ded5e8","Type":"ContainerStarted","Data":"d2886a84fbac16f524c36eda301c3d0bc2bcbc3841d260b8e3b33402230d1ed1"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.595773 
4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cg64k" event={"ID":"2724d82d-6eac-48a5-8ea8-4008702ab558","Type":"ContainerStarted","Data":"fd5dcb6d224d611aa5492d2aa912076a195cb4261bcae88d4d6cf47ffad335ac"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.595797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cg64k" event={"ID":"2724d82d-6eac-48a5-8ea8-4008702ab558","Type":"ContainerStarted","Data":"1579fc74a3e664631ff1f0407d1736df7174bb22329b9f361fdf42e4762a808b"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.598053 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e401-account-create-update-4qk4n" event={"ID":"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4","Type":"ContainerStarted","Data":"61818f492a085cfc2f7bae53c0cae6feee90e669bbabf656eef07532e41f8901"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.598077 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e401-account-create-update-4qk4n" event={"ID":"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4","Type":"ContainerStarted","Data":"e726c566dcf1f8e65de39297ba47beb3993857cbff7d4e408f4fff19a27cf3db"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.599239 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1cf8-account-create-update-d6lrz" event={"ID":"38b7dbb3-1a0e-47a6-b248-039a22229706","Type":"ContainerStarted","Data":"7c964b4b683a76367c49e0bee5c8a41f37e254001f963168fc0539224999344b"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.601036 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xdr6c" event={"ID":"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476","Type":"ContainerStarted","Data":"aabfde1b411ce97be69615fd201b8d9ab511416523034e07259e780b18259812"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.608471 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-nkbd8" event={"ID":"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad","Type":"ContainerStarted","Data":"5868f8510d0aae4b161978a06a88f463f53b293dc09a0fd1522fa53227a705d8"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.608688 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nkbd8" event={"ID":"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad","Type":"ContainerStarted","Data":"b3133b0797e90234a83a252d40bbcb8050eee63556b1f81a264129807b8a5640"} Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.631230 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-cg64k" podStartSLOduration=2.6312138640000002 podStartE2EDuration="2.631213864s" podCreationTimestamp="2026-02-15 17:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:37.628459109 +0000 UTC m=+953.571867241" watchObservedRunningTime="2026-02-15 17:21:37.631213864 +0000 UTC m=+953.574621996" Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.653039 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cg995"] Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.688063 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e401-account-create-update-4qk4n" podStartSLOduration=2.688047613 podStartE2EDuration="2.688047613s" podCreationTimestamp="2026-02-15 17:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:37.687302263 +0000 UTC m=+953.630710395" watchObservedRunningTime="2026-02-15 17:21:37.688047613 +0000 UTC m=+953.631455735" Feb 15 17:21:37 crc kubenswrapper[4585]: I0215 17:21:37.689478 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-nkbd8" 
podStartSLOduration=2.6894730730000003 podStartE2EDuration="2.689473073s" podCreationTimestamp="2026-02-15 17:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:37.669615891 +0000 UTC m=+953.613024023" watchObservedRunningTime="2026-02-15 17:21:37.689473073 +0000 UTC m=+953.632881195" Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.618038 4585 generic.go:334] "Generic (PLEG): container finished" podID="792430d2-7a82-48c8-b091-fa2875ded5e8" containerID="506451b50970b852b7fc0b12b953d97fe283737f32279dbe24de47e4899e1b1e" exitCode=0 Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.618189 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da70-account-create-update-gvms7" event={"ID":"792430d2-7a82-48c8-b091-fa2875ded5e8","Type":"ContainerDied","Data":"506451b50970b852b7fc0b12b953d97fe283737f32279dbe24de47e4899e1b1e"} Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.620058 4585 generic.go:334] "Generic (PLEG): container finished" podID="2724d82d-6eac-48a5-8ea8-4008702ab558" containerID="fd5dcb6d224d611aa5492d2aa912076a195cb4261bcae88d4d6cf47ffad335ac" exitCode=0 Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.620103 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cg64k" event={"ID":"2724d82d-6eac-48a5-8ea8-4008702ab558","Type":"ContainerDied","Data":"fd5dcb6d224d611aa5492d2aa912076a195cb4261bcae88d4d6cf47ffad335ac"} Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.622263 4585 generic.go:334] "Generic (PLEG): container finished" podID="49a35fd8-ae4c-4c30-a41b-0b1413b82ef4" containerID="61818f492a085cfc2f7bae53c0cae6feee90e669bbabf656eef07532e41f8901" exitCode=0 Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.622298 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e401-account-create-update-4qk4n" 
event={"ID":"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4","Type":"ContainerDied","Data":"61818f492a085cfc2f7bae53c0cae6feee90e669bbabf656eef07532e41f8901"} Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.623939 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cg995" event={"ID":"5f71e9d8-c516-41bf-89de-ddd7d51519f6","Type":"ContainerStarted","Data":"fd6b1f09a9de9e0e0708c8b40ce8ffb142987d92b3c9d3959f91fb0e8b945ee0"} Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.625749 4585 generic.go:334] "Generic (PLEG): container finished" podID="38b7dbb3-1a0e-47a6-b248-039a22229706" containerID="3d38d5d04daecc02733746c64928e47c31b78bf58d8e657112b54ac302d83cd1" exitCode=0 Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.625802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1cf8-account-create-update-d6lrz" event={"ID":"38b7dbb3-1a0e-47a6-b248-039a22229706","Type":"ContainerDied","Data":"3d38d5d04daecc02733746c64928e47c31b78bf58d8e657112b54ac302d83cd1"} Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.628271 4585 generic.go:334] "Generic (PLEG): container finished" podID="f3f97f63-b51e-4adf-8ec3-1d2aefc4c476" containerID="8e4844b141caedf8bcbbb89a0019b89351c81bc8f01dc6a6c17894a433d2fc98" exitCode=0 Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.628309 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xdr6c" event={"ID":"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476","Type":"ContainerDied","Data":"8e4844b141caedf8bcbbb89a0019b89351c81bc8f01dc6a6c17894a433d2fc98"} Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.629869 4585 generic.go:334] "Generic (PLEG): container finished" podID="47f8a86e-f29d-4ab2-ae55-762f90fdd6ad" containerID="5868f8510d0aae4b161978a06a88f463f53b293dc09a0fd1522fa53227a705d8" exitCode=0 Feb 15 17:21:38 crc kubenswrapper[4585]: I0215 17:21:38.629901 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-nkbd8" event={"ID":"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad","Type":"ContainerDied","Data":"5868f8510d0aae4b161978a06a88f463f53b293dc09a0fd1522fa53227a705d8"} Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.236089 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.395892 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2zv6\" (UniqueName: \"kubernetes.io/projected/38b7dbb3-1a0e-47a6-b248-039a22229706-kube-api-access-t2zv6\") pod \"38b7dbb3-1a0e-47a6-b248-039a22229706\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.396446 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38b7dbb3-1a0e-47a6-b248-039a22229706-operator-scripts\") pod \"38b7dbb3-1a0e-47a6-b248-039a22229706\" (UID: \"38b7dbb3-1a0e-47a6-b248-039a22229706\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.396882 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b7dbb3-1a0e-47a6-b248-039a22229706-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38b7dbb3-1a0e-47a6-b248-039a22229706" (UID: "38b7dbb3-1a0e-47a6-b248-039a22229706"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.401452 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b7dbb3-1a0e-47a6-b248-039a22229706-kube-api-access-t2zv6" (OuterVolumeSpecName: "kube-api-access-t2zv6") pod "38b7dbb3-1a0e-47a6-b248-039a22229706" (UID: "38b7dbb3-1a0e-47a6-b248-039a22229706"). InnerVolumeSpecName "kube-api-access-t2zv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.448313 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.459711 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.458497 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.490860 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.491391 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.499070 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2zv6\" (UniqueName: \"kubernetes.io/projected/38b7dbb3-1a0e-47a6-b248-039a22229706-kube-api-access-t2zv6\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.501259 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38b7dbb3-1a0e-47a6-b248-039a22229706-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602208 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2724d82d-6eac-48a5-8ea8-4008702ab558-operator-scripts\") pod \"2724d82d-6eac-48a5-8ea8-4008702ab558\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602274 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm2wr\" (UniqueName: \"kubernetes.io/projected/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-kube-api-access-sm2wr\") pod \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\" (UID: \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602355 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-operator-scripts\") pod \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\" (UID: \"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602378 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-operator-scripts\") pod \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602424 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmlh9\" (UniqueName: \"kubernetes.io/projected/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-kube-api-access-rmlh9\") pod \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602461 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msfgq\" (UniqueName: \"kubernetes.io/projected/2724d82d-6eac-48a5-8ea8-4008702ab558-kube-api-access-msfgq\") pod \"2724d82d-6eac-48a5-8ea8-4008702ab558\" (UID: \"2724d82d-6eac-48a5-8ea8-4008702ab558\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602495 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-operator-scripts\") pod \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\" (UID: \"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602526 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792430d2-7a82-48c8-b091-fa2875ded5e8-operator-scripts\") pod \"792430d2-7a82-48c8-b091-fa2875ded5e8\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602567 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs7b9\" (UniqueName: \"kubernetes.io/projected/792430d2-7a82-48c8-b091-fa2875ded5e8-kube-api-access-bs7b9\") pod \"792430d2-7a82-48c8-b091-fa2875ded5e8\" (UID: \"792430d2-7a82-48c8-b091-fa2875ded5e8\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602590 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppv29\" (UniqueName: \"kubernetes.io/projected/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-kube-api-access-ppv29\") pod \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\" (UID: \"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad\") " Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.602747 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2724d82d-6eac-48a5-8ea8-4008702ab558-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2724d82d-6eac-48a5-8ea8-4008702ab558" (UID: "2724d82d-6eac-48a5-8ea8-4008702ab558"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.603322 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792430d2-7a82-48c8-b091-fa2875ded5e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "792430d2-7a82-48c8-b091-fa2875ded5e8" (UID: "792430d2-7a82-48c8-b091-fa2875ded5e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.603627 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47f8a86e-f29d-4ab2-ae55-762f90fdd6ad" (UID: "47f8a86e-f29d-4ab2-ae55-762f90fdd6ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.603938 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49a35fd8-ae4c-4c30-a41b-0b1413b82ef4" (UID: "49a35fd8-ae4c-4c30-a41b-0b1413b82ef4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.605089 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2724d82d-6eac-48a5-8ea8-4008702ab558-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.605912 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3f97f63-b51e-4adf-8ec3-1d2aefc4c476" (UID: "f3f97f63-b51e-4adf-8ec3-1d2aefc4c476"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.606142 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-kube-api-access-sm2wr" (OuterVolumeSpecName: "kube-api-access-sm2wr") pod "f3f97f63-b51e-4adf-8ec3-1d2aefc4c476" (UID: "f3f97f63-b51e-4adf-8ec3-1d2aefc4c476"). InnerVolumeSpecName "kube-api-access-sm2wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.606194 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792430d2-7a82-48c8-b091-fa2875ded5e8-kube-api-access-bs7b9" (OuterVolumeSpecName: "kube-api-access-bs7b9") pod "792430d2-7a82-48c8-b091-fa2875ded5e8" (UID: "792430d2-7a82-48c8-b091-fa2875ded5e8"). InnerVolumeSpecName "kube-api-access-bs7b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.606324 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-kube-api-access-ppv29" (OuterVolumeSpecName: "kube-api-access-ppv29") pod "47f8a86e-f29d-4ab2-ae55-762f90fdd6ad" (UID: "47f8a86e-f29d-4ab2-ae55-762f90fdd6ad"). InnerVolumeSpecName "kube-api-access-ppv29". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.609328 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2724d82d-6eac-48a5-8ea8-4008702ab558-kube-api-access-msfgq" (OuterVolumeSpecName: "kube-api-access-msfgq") pod "2724d82d-6eac-48a5-8ea8-4008702ab558" (UID: "2724d82d-6eac-48a5-8ea8-4008702ab558"). InnerVolumeSpecName "kube-api-access-msfgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.611039 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-kube-api-access-rmlh9" (OuterVolumeSpecName: "kube-api-access-rmlh9") pod "49a35fd8-ae4c-4c30-a41b-0b1413b82ef4" (UID: "49a35fd8-ae4c-4c30-a41b-0b1413b82ef4"). InnerVolumeSpecName "kube-api-access-rmlh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.665986 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1cf8-account-create-update-d6lrz" event={"ID":"38b7dbb3-1a0e-47a6-b248-039a22229706","Type":"ContainerDied","Data":"7c964b4b683a76367c49e0bee5c8a41f37e254001f963168fc0539224999344b"} Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.666321 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c964b4b683a76367c49e0bee5c8a41f37e254001f963168fc0539224999344b" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.666403 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1cf8-account-create-update-d6lrz" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.672256 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xdr6c" event={"ID":"f3f97f63-b51e-4adf-8ec3-1d2aefc4c476","Type":"ContainerDied","Data":"aabfde1b411ce97be69615fd201b8d9ab511416523034e07259e780b18259812"} Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.672507 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aabfde1b411ce97be69615fd201b8d9ab511416523034e07259e780b18259812" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.672555 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xdr6c" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.691987 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-nkbd8" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.691981 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-nkbd8" event={"ID":"47f8a86e-f29d-4ab2-ae55-762f90fdd6ad","Type":"ContainerDied","Data":"b3133b0797e90234a83a252d40bbcb8050eee63556b1f81a264129807b8a5640"} Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.692562 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3133b0797e90234a83a252d40bbcb8050eee63556b1f81a264129807b8a5640" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.694682 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-da70-account-create-update-gvms7" event={"ID":"792430d2-7a82-48c8-b091-fa2875ded5e8","Type":"ContainerDied","Data":"d2886a84fbac16f524c36eda301c3d0bc2bcbc3841d260b8e3b33402230d1ed1"} Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.694719 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2886a84fbac16f524c36eda301c3d0bc2bcbc3841d260b8e3b33402230d1ed1" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.694773 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-da70-account-create-update-gvms7" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.700564 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cg64k" event={"ID":"2724d82d-6eac-48a5-8ea8-4008702ab558","Type":"ContainerDied","Data":"1579fc74a3e664631ff1f0407d1736df7174bb22329b9f361fdf42e4762a808b"} Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.700750 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cg64k" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.700762 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1579fc74a3e664631ff1f0407d1736df7174bb22329b9f361fdf42e4762a808b" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.702614 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e401-account-create-update-4qk4n" event={"ID":"49a35fd8-ae4c-4c30-a41b-0b1413b82ef4","Type":"ContainerDied","Data":"e726c566dcf1f8e65de39297ba47beb3993857cbff7d4e408f4fff19a27cf3db"} Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.702639 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e726c566dcf1f8e65de39297ba47beb3993857cbff7d4e408f4fff19a27cf3db" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.702676 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e401-account-create-update-4qk4n" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707562 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm2wr\" (UniqueName: \"kubernetes.io/projected/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-kube-api-access-sm2wr\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707584 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707608 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707617 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmlh9\" 
(UniqueName: \"kubernetes.io/projected/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-kube-api-access-rmlh9\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707626 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msfgq\" (UniqueName: \"kubernetes.io/projected/2724d82d-6eac-48a5-8ea8-4008702ab558-kube-api-access-msfgq\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707634 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707654 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792430d2-7a82-48c8-b091-fa2875ded5e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707664 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs7b9\" (UniqueName: \"kubernetes.io/projected/792430d2-7a82-48c8-b091-fa2875ded5e8-kube-api-access-bs7b9\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:40 crc kubenswrapper[4585]: I0215 17:21:40.707672 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppv29\" (UniqueName: \"kubernetes.io/projected/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad-kube-api-access-ppv29\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:42 crc kubenswrapper[4585]: I0215 17:21:42.145620 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:42 crc kubenswrapper[4585]: I0215 17:21:42.217266 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwgww" Feb 15 17:21:42 crc kubenswrapper[4585]: I0215 17:21:42.911407 4585 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwgww"]
Feb 15 17:21:43 crc kubenswrapper[4585]: I0215 17:21:43.419779 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2"
Feb 15 17:21:43 crc kubenswrapper[4585]: I0215 17:21:43.600263 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8qg92"]
Feb 15 17:21:43 crc kubenswrapper[4585]: I0215 17:21:43.600764 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" podUID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerName="dnsmasq-dns" containerID="cri-o://445760de021578cfbcb54d581b7cb9c8735b50399d6b0c66343381a2dff920a7" gracePeriod=10
Feb 15 17:21:43 crc kubenswrapper[4585]: I0215 17:21:43.738064 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwgww" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="registry-server" containerID="cri-o://249ff933754966b2c1225475204a7f5a4f7b6c40fac64bf98b56e5c20e7b128d" gracePeriod=2
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.753264 4585 generic.go:334] "Generic (PLEG): container finished" podID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerID="445760de021578cfbcb54d581b7cb9c8735b50399d6b0c66343381a2dff920a7" exitCode=0
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.753507 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" event={"ID":"86b120f0-0567-42fa-b617-69cf1d4854c7","Type":"ContainerDied","Data":"445760de021578cfbcb54d581b7cb9c8735b50399d6b0c66343381a2dff920a7"}
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.753531 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92" event={"ID":"86b120f0-0567-42fa-b617-69cf1d4854c7","Type":"ContainerDied","Data":"223babbfa2e6ad1335bc7da2d67e8034bd57318541cae878e3794930acc0f1ea"}
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.753543 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="223babbfa2e6ad1335bc7da2d67e8034bd57318541cae878e3794930acc0f1ea"
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.754583 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cg995" event={"ID":"5f71e9d8-c516-41bf-89de-ddd7d51519f6","Type":"ContainerStarted","Data":"e1adad7efb0945afc416ec66fecc4e5baf715bfe31534719b77a3ad1a0841147"}
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.767899 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92"
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.768569 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerID="249ff933754966b2c1225475204a7f5a4f7b6c40fac64bf98b56e5c20e7b128d" exitCode=0
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.768644 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwgww" event={"ID":"4e9a733a-04ee-4eff-bf0a-91f112778d91","Type":"ContainerDied","Data":"249ff933754966b2c1225475204a7f5a4f7b6c40fac64bf98b56e5c20e7b128d"}
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.843223 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-cg995" podStartSLOduration=3.067430256 podStartE2EDuration="9.84320808s" podCreationTimestamp="2026-02-15 17:21:35 +0000 UTC" firstStartedPulling="2026-02-15 17:21:37.660229555 +0000 UTC m=+953.603637687" lastFinishedPulling="2026-02-15 17:21:44.436007379 +0000 UTC m=+960.379415511" observedRunningTime="2026-02-15 17:21:44.780084119 +0000 UTC m=+960.723492241" watchObservedRunningTime="2026-02-15 17:21:44.84320808 +0000 UTC m=+960.786616212"
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.854491 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwgww"
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.916272 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh5kw\" (UniqueName: \"kubernetes.io/projected/86b120f0-0567-42fa-b617-69cf1d4854c7-kube-api-access-vh5kw\") pod \"86b120f0-0567-42fa-b617-69cf1d4854c7\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") "
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.916409 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-nb\") pod \"86b120f0-0567-42fa-b617-69cf1d4854c7\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") "
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.916435 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-config\") pod \"86b120f0-0567-42fa-b617-69cf1d4854c7\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") "
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.916472 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-dns-svc\") pod \"86b120f0-0567-42fa-b617-69cf1d4854c7\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") "
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.916539 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-sb\") pod \"86b120f0-0567-42fa-b617-69cf1d4854c7\" (UID: \"86b120f0-0567-42fa-b617-69cf1d4854c7\") "
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.920619 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b120f0-0567-42fa-b617-69cf1d4854c7-kube-api-access-vh5kw" (OuterVolumeSpecName: "kube-api-access-vh5kw") pod "86b120f0-0567-42fa-b617-69cf1d4854c7" (UID: "86b120f0-0567-42fa-b617-69cf1d4854c7"). InnerVolumeSpecName "kube-api-access-vh5kw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.966187 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86b120f0-0567-42fa-b617-69cf1d4854c7" (UID: "86b120f0-0567-42fa-b617-69cf1d4854c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.978736 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86b120f0-0567-42fa-b617-69cf1d4854c7" (UID: "86b120f0-0567-42fa-b617-69cf1d4854c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:44 crc kubenswrapper[4585]: I0215 17:21:44.994265 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86b120f0-0567-42fa-b617-69cf1d4854c7" (UID: "86b120f0-0567-42fa-b617-69cf1d4854c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.003246 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-config" (OuterVolumeSpecName: "config") pod "86b120f0-0567-42fa-b617-69cf1d4854c7" (UID: "86b120f0-0567-42fa-b617-69cf1d4854c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.018651 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-utilities\") pod \"4e9a733a-04ee-4eff-bf0a-91f112778d91\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") "
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.018698 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjsk2\" (UniqueName: \"kubernetes.io/projected/4e9a733a-04ee-4eff-bf0a-91f112778d91-kube-api-access-pjsk2\") pod \"4e9a733a-04ee-4eff-bf0a-91f112778d91\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") "
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.018715 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-catalog-content\") pod \"4e9a733a-04ee-4eff-bf0a-91f112778d91\" (UID: \"4e9a733a-04ee-4eff-bf0a-91f112778d91\") "
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.019151 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.019170 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-config\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.019179 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.019188 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b120f0-0567-42fa-b617-69cf1d4854c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.019196 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh5kw\" (UniqueName: \"kubernetes.io/projected/86b120f0-0567-42fa-b617-69cf1d4854c7-kube-api-access-vh5kw\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.020006 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-utilities" (OuterVolumeSpecName: "utilities") pod "4e9a733a-04ee-4eff-bf0a-91f112778d91" (UID: "4e9a733a-04ee-4eff-bf0a-91f112778d91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.022216 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9a733a-04ee-4eff-bf0a-91f112778d91-kube-api-access-pjsk2" (OuterVolumeSpecName: "kube-api-access-pjsk2") pod "4e9a733a-04ee-4eff-bf0a-91f112778d91" (UID: "4e9a733a-04ee-4eff-bf0a-91f112778d91"). InnerVolumeSpecName "kube-api-access-pjsk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.120421 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-utilities\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.120456 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjsk2\" (UniqueName: \"kubernetes.io/projected/4e9a733a-04ee-4eff-bf0a-91f112778d91-kube-api-access-pjsk2\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.122716 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e9a733a-04ee-4eff-bf0a-91f112778d91" (UID: "4e9a733a-04ee-4eff-bf0a-91f112778d91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.222462 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e9a733a-04ee-4eff-bf0a-91f112778d91-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.786989 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-8qg92"
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.787817 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwgww" event={"ID":"4e9a733a-04ee-4eff-bf0a-91f112778d91","Type":"ContainerDied","Data":"db5e8f7d3e4b31b42a0fbe1dca9cb284675fedd07f6af8e11d2d9eb6659873e6"}
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.787874 4585 scope.go:117] "RemoveContainer" containerID="249ff933754966b2c1225475204a7f5a4f7b6c40fac64bf98b56e5c20e7b128d"
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.788490 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwgww"
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.844289 4585 scope.go:117] "RemoveContainer" containerID="a54af59911e03d5335a4ecf6cc48dfe7766178a3c85e9656fd5566ce551bb80c"
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.855838 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8qg92"]
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.885859 4585 scope.go:117] "RemoveContainer" containerID="8038e0cf928e36207cf5c0ab2f488c3b6bd97e523c11afc4d46049a8e3584012"
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.886916 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-8qg92"]
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.921750 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwgww"]
Feb 15 17:21:45 crc kubenswrapper[4585]: I0215 17:21:45.931580 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwgww"]
Feb 15 17:21:46 crc kubenswrapper[4585]: I0215 17:21:46.861135 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" path="/var/lib/kubelet/pods/4e9a733a-04ee-4eff-bf0a-91f112778d91/volumes"
Feb 15 17:21:46 crc kubenswrapper[4585]: I0215 17:21:46.862515 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b120f0-0567-42fa-b617-69cf1d4854c7" path="/var/lib/kubelet/pods/86b120f0-0567-42fa-b617-69cf1d4854c7/volumes"
Feb 15 17:21:47 crc kubenswrapper[4585]: I0215 17:21:47.806811 4585 generic.go:334] "Generic (PLEG): container finished" podID="5f71e9d8-c516-41bf-89de-ddd7d51519f6" containerID="e1adad7efb0945afc416ec66fecc4e5baf715bfe31534719b77a3ad1a0841147" exitCode=0
Feb 15 17:21:47 crc kubenswrapper[4585]: I0215 17:21:47.806858 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cg995" event={"ID":"5f71e9d8-c516-41bf-89de-ddd7d51519f6","Type":"ContainerDied","Data":"e1adad7efb0945afc416ec66fecc4e5baf715bfe31534719b77a3ad1a0841147"}
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.210553 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cg995"
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.211907 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8z8j\" (UniqueName: \"kubernetes.io/projected/5f71e9d8-c516-41bf-89de-ddd7d51519f6-kube-api-access-n8z8j\") pod \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") "
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.212035 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-combined-ca-bundle\") pod \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") "
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.212087 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-config-data\") pod \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\" (UID: \"5f71e9d8-c516-41bf-89de-ddd7d51519f6\") "
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.217867 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f71e9d8-c516-41bf-89de-ddd7d51519f6-kube-api-access-n8z8j" (OuterVolumeSpecName: "kube-api-access-n8z8j") pod "5f71e9d8-c516-41bf-89de-ddd7d51519f6" (UID: "5f71e9d8-c516-41bf-89de-ddd7d51519f6"). InnerVolumeSpecName "kube-api-access-n8z8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.257009 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f71e9d8-c516-41bf-89de-ddd7d51519f6" (UID: "5f71e9d8-c516-41bf-89de-ddd7d51519f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.280294 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-config-data" (OuterVolumeSpecName: "config-data") pod "5f71e9d8-c516-41bf-89de-ddd7d51519f6" (UID: "5f71e9d8-c516-41bf-89de-ddd7d51519f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.314333 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8z8j\" (UniqueName: \"kubernetes.io/projected/5f71e9d8-c516-41bf-89de-ddd7d51519f6-kube-api-access-n8z8j\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.314374 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.314389 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f71e9d8-c516-41bf-89de-ddd7d51519f6-config-data\") on node \"crc\" DevicePath \"\""
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.837646 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cg995" event={"ID":"5f71e9d8-c516-41bf-89de-ddd7d51519f6","Type":"ContainerDied","Data":"fd6b1f09a9de9e0e0708c8b40ce8ffb142987d92b3c9d3959f91fb0e8b945ee0"}
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.837905 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd6b1f09a9de9e0e0708c8b40ce8ffb142987d92b3c9d3959f91fb0e8b945ee0"
Feb 15 17:21:49 crc kubenswrapper[4585]: I0215 17:21:49.838002 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cg995"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007268 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-htmph"]
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007771 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2724d82d-6eac-48a5-8ea8-4008702ab558" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007789 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2724d82d-6eac-48a5-8ea8-4008702ab558" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007800 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="registry-server"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007806 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="registry-server"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007824 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f8a86e-f29d-4ab2-ae55-762f90fdd6ad" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007830 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f8a86e-f29d-4ab2-ae55-762f90fdd6ad" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007839 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerName="dnsmasq-dns"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007845 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerName="dnsmasq-dns"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007858 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f71e9d8-c516-41bf-89de-ddd7d51519f6" containerName="keystone-db-sync"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007864 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f71e9d8-c516-41bf-89de-ddd7d51519f6" containerName="keystone-db-sync"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007879 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="extract-utilities"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007886 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="extract-utilities"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007909 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3f97f63-b51e-4adf-8ec3-1d2aefc4c476" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007916 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f97f63-b51e-4adf-8ec3-1d2aefc4c476" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007935 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a35fd8-ae4c-4c30-a41b-0b1413b82ef4" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007940 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a35fd8-ae4c-4c30-a41b-0b1413b82ef4" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007950 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="extract-content"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007956 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="extract-content"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007965 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerName="init"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007970 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerName="init"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.007983 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b7dbb3-1a0e-47a6-b248-039a22229706" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.007989 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b7dbb3-1a0e-47a6-b248-039a22229706" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: E0215 17:21:50.008002 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792430d2-7a82-48c8-b091-fa2875ded5e8" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008008 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="792430d2-7a82-48c8-b091-fa2875ded5e8" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008185 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3f97f63-b51e-4adf-8ec3-1d2aefc4c476" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008203 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f8a86e-f29d-4ab2-ae55-762f90fdd6ad" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008214 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="792430d2-7a82-48c8-b091-fa2875ded5e8" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008225 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f71e9d8-c516-41bf-89de-ddd7d51519f6" containerName="keystone-db-sync"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008238 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9a733a-04ee-4eff-bf0a-91f112778d91" containerName="registry-server"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008243 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b7dbb3-1a0e-47a6-b248-039a22229706" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008253 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a35fd8-ae4c-4c30-a41b-0b1413b82ef4" containerName="mariadb-account-create-update"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008262 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2724d82d-6eac-48a5-8ea8-4008702ab558" containerName="mariadb-database-create"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.008272 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b120f0-0567-42fa-b617-69cf1d4854c7" containerName="dnsmasq-dns"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.009287 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.025109 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gp6zx"]
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.031273 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.037347 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.037517 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6w8mf"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.040152 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.040330 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.040856 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.065313 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gp6zx"]
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.131764 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-htmph"]
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136710 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjsf\" (UniqueName: \"kubernetes.io/projected/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-kube-api-access-cdjsf\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136757 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-config-data\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136781 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136805 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136827 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136848 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-scripts\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-fernet-keys\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.136994 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.137022 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-config\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.137059 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-combined-ca-bundle\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.137089 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q25f8\" (UniqueName: \"kubernetes.io/projected/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-kube-api-access-q25f8\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.137108 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-credential-keys\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.238989 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-fernet-keys\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239048 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239080 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-config\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239117 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-combined-ca-bundle\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239150 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q25f8\" (UniqueName: \"kubernetes.io/projected/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-kube-api-access-q25f8\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239168 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-credential-keys\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239188 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjsf\" (UniqueName: \"kubernetes.io/projected/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-kube-api-access-cdjsf\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239205 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-config-data\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239222 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239243 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239261 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.239278 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-scripts\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.240158 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.242416 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-config\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.247962 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph"
Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.249409 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName:
\"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.249965 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.251894 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-credential-keys\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.252066 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-config-data\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.253682 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-scripts\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.263505 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-fernet-keys\") pod \"keystone-bootstrap-gp6zx\" (UID: 
\"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.274417 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-combined-ca-bundle\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.330659 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q25f8\" (UniqueName: \"kubernetes.io/projected/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-kube-api-access-q25f8\") pod \"dnsmasq-dns-bbf5cc879-htmph\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " pod="openstack/dnsmasq-dns-bbf5cc879-htmph" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.333671 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjsf\" (UniqueName: \"kubernetes.io/projected/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-kube-api-access-cdjsf\") pod \"keystone-bootstrap-gp6zx\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.357187 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.362777 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-4vdfm"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.364043 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.386846 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6948db496c-b4lj8"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.388684 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.404153 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.404413 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tlwjs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.404507 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.410304 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.411712 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mx9kt" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.411977 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.412197 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.420544 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4vdfm"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.446515 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6948db496c-b4lj8"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.477176 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-sync-n4nzj"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.478400 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.494887 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.495259 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mx4pk" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.495434 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.532958 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n4nzj"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.564167 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3f0b27-f6d2-44ef-8e81-4052749cd681-logs\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.567888 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc4gq\" (UniqueName: \"kubernetes.io/projected/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-kube-api-access-jc4gq\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.568022 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-scripts\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") 
" pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.568096 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-db-sync-config-data\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.568178 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-etc-machine-id\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.568269 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-config-data\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.568349 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bp4j\" (UniqueName: \"kubernetes.io/projected/5d3f0b27-f6d2-44ef-8e81-4052749cd681-kube-api-access-4bp4j\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.568418 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-combined-ca-bundle\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " 
pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.568632 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k45br\" (UniqueName: \"kubernetes.io/projected/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-kube-api-access-k45br\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.579592 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-config-data\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.580023 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-config\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.580110 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-scripts\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.580176 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-combined-ca-bundle\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 
15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.580257 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3f0b27-f6d2-44ef-8e81-4052749cd681-horizon-secret-key\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.570001 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.607736 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.613026 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.613189 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.626941 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-htmph" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.644618 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-htmph"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.672246 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.689465 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-scripts\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.689587 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-db-sync-config-data\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.689684 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-etc-machine-id\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.689751 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-config-data\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.689829 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.689895 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-combined-ca-bundle\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.689970 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bp4j\" (UniqueName: \"kubernetes.io/projected/5d3f0b27-f6d2-44ef-8e81-4052749cd681-kube-api-access-4bp4j\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690048 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-run-httpd\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690146 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-log-httpd\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690232 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897tw\" (UniqueName: 
\"kubernetes.io/projected/624871de-b62e-4eae-a220-a5d34995919d-kube-api-access-897tw\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690299 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k45br\" (UniqueName: \"kubernetes.io/projected/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-kube-api-access-k45br\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690373 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-config-data\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690465 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-scripts\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690540 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-config\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690616 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-scripts\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " 
pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690691 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-combined-ca-bundle\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690763 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3f0b27-f6d2-44ef-8e81-4052749cd681-horizon-secret-key\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690838 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690911 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-config-data\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.690991 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3f0b27-f6d2-44ef-8e81-4052749cd681-logs\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.691073 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc4gq\" (UniqueName: \"kubernetes.io/projected/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-kube-api-access-jc4gq\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.693519 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-scripts\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.695012 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3f0b27-f6d2-44ef-8e81-4052749cd681-logs\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.697011 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-etc-machine-id\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.701745 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-config-data\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.704124 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-scripts\") pod 
\"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.704845 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-db-sync-config-data\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.718271 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-config\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.719153 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3f0b27-f6d2-44ef-8e81-4052749cd681-horizon-secret-key\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.719248 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-config-data\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.720069 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-combined-ca-bundle\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 
17:21:50.721281 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-combined-ca-bundle\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.730667 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-588bd4c977-78cs7"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.732208 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.733362 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k45br\" (UniqueName: \"kubernetes.io/projected/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-kube-api-access-k45br\") pod \"cinder-db-sync-4vdfm\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.746283 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc4gq\" (UniqueName: \"kubernetes.io/projected/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-kube-api-access-jc4gq\") pod \"neutron-db-sync-n4nzj\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.758161 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ffnrs"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.759351 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.767855 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.769474 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fjt9j" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.769633 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.770213 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bp4j\" (UniqueName: \"kubernetes.io/projected/5d3f0b27-f6d2-44ef-8e81-4052749cd681-kube-api-access-4bp4j\") pod \"horizon-6948db496c-b4lj8\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.772954 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ffnrs"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.783474 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-588bd4c977-78cs7"] Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.801365 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lbg2\" (UniqueName: \"kubernetes.io/projected/c940d6f6-235b-4817-b022-b5d783c98a5b-kube-api-access-8lbg2\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.801750 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.802993 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-config-data\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803097 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-scripts\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803172 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c940d6f6-235b-4817-b022-b5d783c98a5b-logs\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803260 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-config-data\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803572 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803666 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-scripts\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803746 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-run-httpd\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803820 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd6bd2c7-299c-40f4-ab82-240091e39764-logs\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803898 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stb64\" (UniqueName: \"kubernetes.io/projected/dd6bd2c7-299c-40f4-ab82-240091e39764-kube-api-access-stb64\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.803977 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-config-data\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.804089 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-log-httpd\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.804170 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd6bd2c7-299c-40f4-ab82-240091e39764-horizon-secret-key\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.804250 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897tw\" (UniqueName: \"kubernetes.io/projected/624871de-b62e-4eae-a220-a5d34995919d-kube-api-access-897tw\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.804334 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-scripts\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.804412 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-combined-ca-bundle\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.801702 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.808365 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-log-httpd\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.808611 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-run-httpd\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.812336 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.838568 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.868356 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.878049 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-scripts\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.895478 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897tw\" (UniqueName: \"kubernetes.io/projected/624871de-b62e-4eae-a220-a5d34995919d-kube-api-access-897tw\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.897082 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-config-data\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.899880 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") " pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905575 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-combined-ca-bundle\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905658 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8lbg2\" (UniqueName: \"kubernetes.io/projected/c940d6f6-235b-4817-b022-b5d783c98a5b-kube-api-access-8lbg2\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905708 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-scripts\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905743 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c940d6f6-235b-4817-b022-b5d783c98a5b-logs\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905777 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-config-data\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-scripts\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905838 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd6bd2c7-299c-40f4-ab82-240091e39764-logs\") pod \"horizon-588bd4c977-78cs7\" (UID: 
\"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905863 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stb64\" (UniqueName: \"kubernetes.io/projected/dd6bd2c7-299c-40f4-ab82-240091e39764-kube-api-access-stb64\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905882 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-config-data\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.905919 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd6bd2c7-299c-40f4-ab82-240091e39764-horizon-secret-key\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.906715 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd6bd2c7-299c-40f4-ab82-240091e39764-logs\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.926957 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-scripts\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 
17:21:50.929007 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-config-data\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.930763 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-combined-ca-bundle\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.936242 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd6bd2c7-299c-40f4-ab82-240091e39764-horizon-secret-key\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.938832 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-config-data\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.940262 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c940d6f6-235b-4817-b022-b5d783c98a5b-logs\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.958375 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:21:50 crc kubenswrapper[4585]: I0215 17:21:50.967636 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lbg2\" (UniqueName: \"kubernetes.io/projected/c940d6f6-235b-4817-b022-b5d783c98a5b-kube-api-access-8lbg2\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.004296 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-scripts\") pod \"placement-db-sync-ffnrs\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.021432 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stb64\" (UniqueName: \"kubernetes.io/projected/dd6bd2c7-299c-40f4-ab82-240091e39764-kube-api-access-stb64\") pod \"horizon-588bd4c977-78cs7\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.056496 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lxs78"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.081845 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.082763 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5zw55"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.082861 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.096570 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.098792 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.104507 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-h7p4c" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.106300 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ffnrs" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.110948 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lxs78"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.141534 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.144813 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.147946 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-trff6" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.147996 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.156147 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.156552 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.168930 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5zw55"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.193322 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225702 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225778 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225813 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-logs\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225841 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225867 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225894 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225909 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tppmx\" (UniqueName: \"kubernetes.io/projected/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-kube-api-access-tppmx\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 
17:21:51.225925 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6r6\" (UniqueName: \"kubernetes.io/projected/3701151b-dc31-421f-a1e1-9d694e13bc86-kube-api-access-kf6r6\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225953 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.225981 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.226001 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.226025 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 
17:21:51.226041 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-config\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.226067 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-combined-ca-bundle\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.226087 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vknjr\" (UniqueName: \"kubernetes.io/projected/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-kube-api-access-vknjr\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.226106 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.226345 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-db-sync-config-data\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 
17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.242642 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.287840 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.291861 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.292015 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.319209 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330363 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-combined-ca-bundle\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330403 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vknjr\" (UniqueName: \"kubernetes.io/projected/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-kube-api-access-vknjr\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330426 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330449 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-db-sync-config-data\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330467 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330488 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-config-data\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330539 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " 
pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330570 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330585 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-scripts\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330618 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/dde51957-2da9-4733-bdaf-130554710119-kube-api-access-hszpv\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330666 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-logs\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " 
pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330712 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330737 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330762 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330853 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330892 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tppmx\" (UniqueName: \"kubernetes.io/projected/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-kube-api-access-tppmx\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " 
pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330917 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6r6\" (UniqueName: \"kubernetes.io/projected/3701151b-dc31-421f-a1e1-9d694e13bc86-kube-api-access-kf6r6\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.330938 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-logs\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.331007 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.331070 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.331102 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc 
kubenswrapper[4585]: I0215 17:21:51.331141 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.331159 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-config\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.331934 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-config\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.344524 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.344709 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.345388 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.345679 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-logs\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.346322 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.357525 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.357774 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.357827 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.357898 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.362304 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.371958 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vknjr\" (UniqueName: \"kubernetes.io/projected/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-kube-api-access-vknjr\") pod \"dnsmasq-dns-56df8fb6b7-lxs78\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.376110 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-combined-ca-bundle\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.390408 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6r6\" (UniqueName: \"kubernetes.io/projected/3701151b-dc31-421f-a1e1-9d694e13bc86-kube-api-access-kf6r6\") pod \"barbican-db-sync-5zw55\" (UID: 
\"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435005 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435062 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-config-data\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435079 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435106 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-scripts\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435125 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 
15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435146 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/dde51957-2da9-4733-bdaf-130554710119-kube-api-access-hszpv\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435189 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435233 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-logs\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.435980 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-logs\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.436169 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.449489 4585 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.454851 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-scripts\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.456137 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.466497 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-db-sync-config-data\") pod \"barbican-db-sync-5zw55\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.477874 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5zw55" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.478932 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.484833 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/dde51957-2da9-4733-bdaf-130554710119-kube-api-access-hszpv\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.509807 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gp6zx"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.532404 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tppmx\" (UniqueName: \"kubernetes.io/projected/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-kube-api-access-tppmx\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.533253 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.534765 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-config-data\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.540808 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.545318 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.599391 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.751262 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4vdfm"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.793821 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: W0215 17:21:51.844956 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad17e3ba_5aaf_4c1a_8490_0e3c1c56aaff.slice/crio-646dc1e86fa59bc74cc670719e08b87183dcd76de730b78e69ccb85da6ea36b6 WatchSource:0}: Error finding container 646dc1e86fa59bc74cc670719e08b87183dcd76de730b78e69ccb85da6ea36b6: Status 404 returned error can't find the container with id 646dc1e86fa59bc74cc670719e08b87183dcd76de730b78e69ccb85da6ea36b6 Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.867476 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.888756 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n4nzj"] Feb 15 17:21:51 crc kubenswrapper[4585]: I0215 17:21:51.907019 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-htmph"] Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.014133 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6948db496c-b4lj8"] Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.113421 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4nzj" event={"ID":"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208","Type":"ContainerStarted","Data":"831cdebe7944a24c7febf6a674826d5465f3052af2e8657490903a81ffc2da24"} Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.130882 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6948db496c-b4lj8" event={"ID":"5d3f0b27-f6d2-44ef-8e81-4052749cd681","Type":"ContainerStarted","Data":"cf42ccc6d249e929cd304794829a796208dc14e8f06ced95bc170ed3415d224b"} Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.134408 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-sync-4vdfm" event={"ID":"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff","Type":"ContainerStarted","Data":"646dc1e86fa59bc74cc670719e08b87183dcd76de730b78e69ccb85da6ea36b6"} Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.141237 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-htmph" event={"ID":"d58f1d37-b0fa-4311-9ceb-1187d130c6e5","Type":"ContainerStarted","Data":"715ffbab6fc9362e6b21e8119e67dad29e7b096fb0bc754e4d2823f84a302cc6"} Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.146753 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gp6zx" event={"ID":"74e6cd3f-191f-4820-bf85-e431e1dbe0b3","Type":"ContainerStarted","Data":"037e91d03ec7c9c87275dc13931713b441eda8b73eb8d3a099740e80e2bf4de8"} Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.402803 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.581409 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-588bd4c977-78cs7"] Feb 15 17:21:52 crc kubenswrapper[4585]: W0215 17:21:52.637011 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd6bd2c7_299c_40f4_ab82_240091e39764.slice/crio-44440ddd640984dc07c6fc034a078c51cf8ba2910842d3ef19f2bac92a813444 WatchSource:0}: Error finding container 44440ddd640984dc07c6fc034a078c51cf8ba2910842d3ef19f2bac92a813444: Status 404 returned error can't find the container with id 44440ddd640984dc07c6fc034a078c51cf8ba2910842d3ef19f2bac92a813444 Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.706284 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ffnrs"] Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.829520 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-lxs78"] Feb 15 17:21:52 crc kubenswrapper[4585]: I0215 17:21:52.877570 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5zw55"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.196877 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.218952 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4nzj" event={"ID":"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208","Type":"ContainerStarted","Data":"5bf1fcaa90b05a0c96ca326dbe9bccfacc8b5baf8af9a1934e391e9e7d721635"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.246174 4585 generic.go:334] "Generic (PLEG): container finished" podID="d58f1d37-b0fa-4311-9ceb-1187d130c6e5" containerID="a6d126d8a847f9d3e3cf6285353793294ee7055109c25a4a1a8a76d3a88fddd1" exitCode=0 Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.233583 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.250797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588bd4c977-78cs7" event={"ID":"dd6bd2c7-299c-40f4-ab82-240091e39764","Type":"ContainerStarted","Data":"44440ddd640984dc07c6fc034a078c51cf8ba2910842d3ef19f2bac92a813444"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.250830 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5zw55" event={"ID":"3701151b-dc31-421f-a1e1-9d694e13bc86","Type":"ContainerStarted","Data":"ae7f85c8ca84766c0484becc447138fe744fb88beef840b61bcea59d60e7c4b4"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.250842 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" 
event={"ID":"f62ae2f9-29ff-4f97-951f-7ef86c863a7d","Type":"ContainerStarted","Data":"5e1b2c39432bddcb5a508f9d5c2428b8409e0b71e41659a8a5318daf9b159a1f"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.250855 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-htmph" event={"ID":"d58f1d37-b0fa-4311-9ceb-1187d130c6e5","Type":"ContainerDied","Data":"a6d126d8a847f9d3e3cf6285353793294ee7055109c25a4a1a8a76d3a88fddd1"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.272858 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerStarted","Data":"ce509fcbd23f9447680f803e5210fb11b059a69f3bcc7517d301d203ded68af2"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.294823 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gp6zx" event={"ID":"74e6cd3f-191f-4820-bf85-e431e1dbe0b3","Type":"ContainerStarted","Data":"516a444534b85d854510fea82808ee1e24f78fb82f4510d73ab0069c17f25241"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.300831 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-n4nzj" podStartSLOduration=3.300818853 podStartE2EDuration="3.300818853s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:53.293031191 +0000 UTC m=+969.236439323" watchObservedRunningTime="2026-02-15 17:21:53.300818853 +0000 UTC m=+969.244226985" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.329054 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffnrs" event={"ID":"c940d6f6-235b-4817-b022-b5d783c98a5b","Type":"ContainerStarted","Data":"98b6382847d962adb8a3485b3a9943a5f3288556e8ae6992a5bb223b936f06df"} Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 
17:21:53.354496 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-588bd4c977-78cs7"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.428787 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c98d54745-hg9jf"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.431592 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.436573 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gp6zx" podStartSLOduration=4.436556053 podStartE2EDuration="4.436556053s" podCreationTimestamp="2026-02-15 17:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:53.389284354 +0000 UTC m=+969.332692486" watchObservedRunningTime="2026-02-15 17:21:53.436556053 +0000 UTC m=+969.379964185" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.493574 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c98d54745-hg9jf"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.506299 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.548110 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.611808 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ea48db7-287e-47f9-9a08-5b2f153fa269-horizon-secret-key\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.611873 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfvdh\" (UniqueName: \"kubernetes.io/projected/0ea48db7-287e-47f9-9a08-5b2f153fa269-kube-api-access-pfvdh\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.611947 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ea48db7-287e-47f9-9a08-5b2f153fa269-logs\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.611975 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-scripts\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.612039 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-config-data\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.718345 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfvdh\" (UniqueName: \"kubernetes.io/projected/0ea48db7-287e-47f9-9a08-5b2f153fa269-kube-api-access-pfvdh\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.718450 4585 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ea48db7-287e-47f9-9a08-5b2f153fa269-logs\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.718477 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-scripts\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.718542 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-config-data\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.718673 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ea48db7-287e-47f9-9a08-5b2f153fa269-horizon-secret-key\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.720264 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ea48db7-287e-47f9-9a08-5b2f153fa269-logs\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.720985 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-scripts\") pod \"horizon-5c98d54745-hg9jf\" (UID: 
\"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.721769 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-config-data\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.737906 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ea48db7-287e-47f9-9a08-5b2f153fa269-horizon-secret-key\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.745816 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfvdh\" (UniqueName: \"kubernetes.io/projected/0ea48db7-287e-47f9-9a08-5b2f153fa269-kube-api-access-pfvdh\") pod \"horizon-5c98d54745-hg9jf\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") " pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.833782 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:21:53 crc kubenswrapper[4585]: I0215 17:21:53.973449 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.041771 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-htmph" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.237575 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q25f8\" (UniqueName: \"kubernetes.io/projected/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-kube-api-access-q25f8\") pod \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.243462 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-svc\") pod \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.243643 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-nb\") pod \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.244077 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-swift-storage-0\") pod \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.245865 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-config\") pod \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.246046 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-sb\") pod \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\" (UID: \"d58f1d37-b0fa-4311-9ceb-1187d130c6e5\") " Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.294846 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-kube-api-access-q25f8" (OuterVolumeSpecName: "kube-api-access-q25f8") pod "d58f1d37-b0fa-4311-9ceb-1187d130c6e5" (UID: "d58f1d37-b0fa-4311-9ceb-1187d130c6e5"). InnerVolumeSpecName "kube-api-access-q25f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.298077 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-config" (OuterVolumeSpecName: "config") pod "d58f1d37-b0fa-4311-9ceb-1187d130c6e5" (UID: "d58f1d37-b0fa-4311-9ceb-1187d130c6e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.311311 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d58f1d37-b0fa-4311-9ceb-1187d130c6e5" (UID: "d58f1d37-b0fa-4311-9ceb-1187d130c6e5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.311858 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d58f1d37-b0fa-4311-9ceb-1187d130c6e5" (UID: "d58f1d37-b0fa-4311-9ceb-1187d130c6e5"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.336045 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d58f1d37-b0fa-4311-9ceb-1187d130c6e5" (UID: "d58f1d37-b0fa-4311-9ceb-1187d130c6e5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.342167 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d58f1d37-b0fa-4311-9ceb-1187d130c6e5" (UID: "d58f1d37-b0fa-4311-9ceb-1187d130c6e5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.348048 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.348074 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.348086 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q25f8\" (UniqueName: \"kubernetes.io/projected/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-kube-api-access-q25f8\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.348095 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 
15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.348102 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.348111 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d58f1d37-b0fa-4311-9ceb-1187d130c6e5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.459269 4585 generic.go:334] "Generic (PLEG): container finished" podID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerID="9967ef3aa0741ff7ceb243269fe1886601e702aa586c4bdf6c890a1a76fcf80f" exitCode=0 Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.459371 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" event={"ID":"f62ae2f9-29ff-4f97-951f-7ef86c863a7d","Type":"ContainerDied","Data":"9967ef3aa0741ff7ceb243269fe1886601e702aa586c4bdf6c890a1a76fcf80f"} Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.467023 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-htmph" event={"ID":"d58f1d37-b0fa-4311-9ceb-1187d130c6e5","Type":"ContainerDied","Data":"715ffbab6fc9362e6b21e8119e67dad29e7b096fb0bc754e4d2823f84a302cc6"} Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.467065 4585 scope.go:117] "RemoveContainer" containerID="a6d126d8a847f9d3e3cf6285353793294ee7055109c25a4a1a8a76d3a88fddd1" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.467178 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-htmph" Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.516900 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"092684ee-0069-4a1c-aeeb-7d7aa694d6c1","Type":"ContainerStarted","Data":"1303a8375a87dac576c118a30fc6cd7f02943aadb9b94a7a1c727faa9244607e"} Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.542309 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dde51957-2da9-4733-bdaf-130554710119","Type":"ContainerStarted","Data":"16f62995870f5a13f13acbe049ce5b58a2545c5aaf67c26ce1e91a7c371c36c9"} Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.694046 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-htmph"] Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.719310 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-htmph"] Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.731975 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c98d54745-hg9jf"] Feb 15 17:21:54 crc kubenswrapper[4585]: I0215 17:21:54.856640 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58f1d37-b0fa-4311-9ceb-1187d130c6e5" path="/var/lib/kubelet/pods/d58f1d37-b0fa-4311-9ceb-1187d130c6e5/volumes" Feb 15 17:21:55 crc kubenswrapper[4585]: I0215 17:21:55.591440 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c98d54745-hg9jf" event={"ID":"0ea48db7-287e-47f9-9a08-5b2f153fa269","Type":"ContainerStarted","Data":"9ec53a0362f05adb5488bd2e8139df4b4c67b39d8a1e1193eef0bf547ddc6f2d"} Feb 15 17:21:55 crc kubenswrapper[4585]: I0215 17:21:55.622573 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"092684ee-0069-4a1c-aeeb-7d7aa694d6c1","Type":"ContainerStarted","Data":"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9"} Feb 15 17:21:55 crc kubenswrapper[4585]: I0215 17:21:55.643393 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" event={"ID":"f62ae2f9-29ff-4f97-951f-7ef86c863a7d","Type":"ContainerStarted","Data":"beff0f4fdea5b332649031eb9ba34409331e0085aceb343594ce8e5376b2ba58"} Feb 15 17:21:55 crc kubenswrapper[4585]: I0215 17:21:55.643650 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:21:55 crc kubenswrapper[4585]: I0215 17:21:55.662567 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" podStartSLOduration=5.662550606 podStartE2EDuration="5.662550606s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:55.659384979 +0000 UTC m=+971.602793131" watchObservedRunningTime="2026-02-15 17:21:55.662550606 +0000 UTC m=+971.605958738" Feb 15 17:21:56 crc kubenswrapper[4585]: I0215 17:21:56.669426 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dde51957-2da9-4733-bdaf-130554710119","Type":"ContainerStarted","Data":"586f194247acb2707459549d0601ca08973f40620fece336c97c506e76cfae71"} Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.687315 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dde51957-2da9-4733-bdaf-130554710119","Type":"ContainerStarted","Data":"c1dc634a11779869acaf6c3afb5529d072c59d4c94aaeb0495d0716b31192a2f"} Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.687387 4585 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-log" containerID="cri-o://586f194247acb2707459549d0601ca08973f40620fece336c97c506e76cfae71" gracePeriod=30 Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.688869 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-httpd" containerID="cri-o://c1dc634a11779869acaf6c3afb5529d072c59d4c94aaeb0495d0716b31192a2f" gracePeriod=30 Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.702576 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"092684ee-0069-4a1c-aeeb-7d7aa694d6c1","Type":"ContainerStarted","Data":"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762"} Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.702803 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-log" containerID="cri-o://51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9" gracePeriod=30 Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.703102 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-httpd" containerID="cri-o://8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762" gracePeriod=30 Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.725358 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.72534281 podStartE2EDuration="7.72534281s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:57.720877097 +0000 UTC m=+973.664285229" watchObservedRunningTime="2026-02-15 17:21:57.72534281 +0000 UTC m=+973.668750942" Feb 15 17:21:57 crc kubenswrapper[4585]: I0215 17:21:57.762658 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.762637916 podStartE2EDuration="7.762637916s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:21:57.748169212 +0000 UTC m=+973.691577344" watchObservedRunningTime="2026-02-15 17:21:57.762637916 +0000 UTC m=+973.706046048" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.522451 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.584254 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-httpd-run\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.584723 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-scripts\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.584761 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tppmx\" (UniqueName: \"kubernetes.io/projected/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-kube-api-access-tppmx\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: 
\"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.585016 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.585078 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-internal-tls-certs\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.585099 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-config-data\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.585115 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-combined-ca-bundle\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.585259 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.585720 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-logs\") pod \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\" (UID: \"092684ee-0069-4a1c-aeeb-7d7aa694d6c1\") " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.586498 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.586786 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-logs" (OuterVolumeSpecName: "logs") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.606410 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-scripts" (OuterVolumeSpecName: "scripts") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.610714 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.622552 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-kube-api-access-tppmx" (OuterVolumeSpecName: "kube-api-access-tppmx") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). InnerVolumeSpecName "kube-api-access-tppmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.652580 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.676062 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-config-data" (OuterVolumeSpecName: "config-data") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.688055 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.688184 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tppmx\" (UniqueName: \"kubernetes.io/projected/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-kube-api-access-tppmx\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.688247 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.688328 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.688404 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.688458 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.697892 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "092684ee-0069-4a1c-aeeb-7d7aa694d6c1" (UID: "092684ee-0069-4a1c-aeeb-7d7aa694d6c1"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.738960 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.752721 4585 generic.go:334] "Generic (PLEG): container finished" podID="dde51957-2da9-4733-bdaf-130554710119" containerID="c1dc634a11779869acaf6c3afb5529d072c59d4c94aaeb0495d0716b31192a2f" exitCode=0 Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.752751 4585 generic.go:334] "Generic (PLEG): container finished" podID="dde51957-2da9-4733-bdaf-130554710119" containerID="586f194247acb2707459549d0601ca08973f40620fece336c97c506e76cfae71" exitCode=143 Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.752809 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dde51957-2da9-4733-bdaf-130554710119","Type":"ContainerDied","Data":"c1dc634a11779869acaf6c3afb5529d072c59d4c94aaeb0495d0716b31192a2f"} Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.752854 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dde51957-2da9-4733-bdaf-130554710119","Type":"ContainerDied","Data":"586f194247acb2707459549d0601ca08973f40620fece336c97c506e76cfae71"} Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.756320 4585 generic.go:334] "Generic (PLEG): container finished" podID="74e6cd3f-191f-4820-bf85-e431e1dbe0b3" containerID="516a444534b85d854510fea82808ee1e24f78fb82f4510d73ab0069c17f25241" exitCode=0 Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.756351 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gp6zx" 
event={"ID":"74e6cd3f-191f-4820-bf85-e431e1dbe0b3","Type":"ContainerDied","Data":"516a444534b85d854510fea82808ee1e24f78fb82f4510d73ab0069c17f25241"} Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.776027 4585 generic.go:334] "Generic (PLEG): container finished" podID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerID="8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762" exitCode=0 Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.776175 4585 generic.go:334] "Generic (PLEG): container finished" podID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerID="51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9" exitCode=143 Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.776153 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.776118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"092684ee-0069-4a1c-aeeb-7d7aa694d6c1","Type":"ContainerDied","Data":"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762"} Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.776260 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"092684ee-0069-4a1c-aeeb-7d7aa694d6c1","Type":"ContainerDied","Data":"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9"} Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.776275 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"092684ee-0069-4a1c-aeeb-7d7aa694d6c1","Type":"ContainerDied","Data":"1303a8375a87dac576c118a30fc6cd7f02943aadb9b94a7a1c727faa9244607e"} Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.776291 4585 scope.go:117] "RemoveContainer" containerID="8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762" Feb 15 17:21:58 crc 
kubenswrapper[4585]: I0215 17:21:58.801483 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/092684ee-0069-4a1c-aeeb-7d7aa694d6c1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.801513 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.889803 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.923164 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.954137 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:58 crc kubenswrapper[4585]: E0215 17:21:58.954775 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-httpd" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.954795 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-httpd" Feb 15 17:21:58 crc kubenswrapper[4585]: E0215 17:21:58.954825 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-log" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.954832 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-log" Feb 15 17:21:58 crc kubenswrapper[4585]: E0215 17:21:58.954857 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58f1d37-b0fa-4311-9ceb-1187d130c6e5" containerName="init" Feb 15 17:21:58 crc 
kubenswrapper[4585]: I0215 17:21:58.954864 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58f1d37-b0fa-4311-9ceb-1187d130c6e5" containerName="init" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.955112 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-log" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.955130 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58f1d37-b0fa-4311-9ceb-1187d130c6e5" containerName="init" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.955153 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" containerName="glance-httpd" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.956515 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.958453 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.959022 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 15 17:21:58 crc kubenswrapper[4585]: I0215 17:21:58.972478 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.013895 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.013951 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.013994 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.014044 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfrdc\" (UniqueName: \"kubernetes.io/projected/42b27229-f2f4-46cc-9070-f9e65aacdc0c-kube-api-access-qfrdc\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.014061 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-logs\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.014143 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.014254 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.014286 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.115660 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-logs\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.115771 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.115917 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.115955 4585 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.115992 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.116017 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.116040 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.116072 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrdc\" (UniqueName: \"kubernetes.io/projected/42b27229-f2f4-46cc-9070-f9e65aacdc0c-kube-api-access-qfrdc\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.116805 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-logs\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.118013 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.118243 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.142940 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.142984 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfrdc\" (UniqueName: \"kubernetes.io/projected/42b27229-f2f4-46cc-9070-f9e65aacdc0c-kube-api-access-qfrdc\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.144174 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.160487 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.169860 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.177145 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.277306 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.565425 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6948db496c-b4lj8"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.601759 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b9f5444b-8n6qh"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.603669 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.605840 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.626718 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-logs\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.626753 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-secret-key\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.626772 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-combined-ca-bundle\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.626819 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfg8t\" (UniqueName: \"kubernetes.io/projected/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-kube-api-access-bfg8t\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.626846 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-scripts\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.626865 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-tls-certs\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.626925 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-config-data\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.630647 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b9f5444b-8n6qh"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.694667 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c98d54745-hg9jf"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.733648 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-logs\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.733687 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-secret-key\") pod \"horizon-7b9f5444b-8n6qh\" (UID: 
\"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.733709 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-combined-ca-bundle\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.733753 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfg8t\" (UniqueName: \"kubernetes.io/projected/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-kube-api-access-bfg8t\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.733775 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-scripts\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.733790 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-tls-certs\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.733809 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-config-data\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 
crc kubenswrapper[4585]: I0215 17:21:59.734878 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-config-data\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.735454 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-scripts\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.735938 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-logs\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.736223 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fb7dd448-vc5x5"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.745466 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-secret-key\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.746061 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-combined-ca-bundle\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc 
kubenswrapper[4585]: I0215 17:21:59.763672 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfg8t\" (UniqueName: \"kubernetes.io/projected/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-kube-api-access-bfg8t\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.764238 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.764328 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.765529 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-tls-certs\") pod \"horizon-7b9f5444b-8n6qh\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.767392 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fb7dd448-vc5x5"] Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.835722 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-horizon-secret-key\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.835766 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-combined-ca-bundle\") pod \"horizon-5fb7dd448-vc5x5\" (UID: 
\"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.835799 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9647g\" (UniqueName: \"kubernetes.io/projected/b1bd46e7-0703-49b5-81f2-516568284547-kube-api-access-9647g\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.835823 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1bd46e7-0703-49b5-81f2-516568284547-config-data\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.835864 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-horizon-tls-certs\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.835883 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1bd46e7-0703-49b5-81f2-516568284547-scripts\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.835934 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1bd46e7-0703-49b5-81f2-516568284547-logs\") pod \"horizon-5fb7dd448-vc5x5\" (UID: 
\"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.930801 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.936826 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9647g\" (UniqueName: \"kubernetes.io/projected/b1bd46e7-0703-49b5-81f2-516568284547-kube-api-access-9647g\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.936872 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1bd46e7-0703-49b5-81f2-516568284547-config-data\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.936920 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-horizon-tls-certs\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.936941 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1bd46e7-0703-49b5-81f2-516568284547-scripts\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.936996 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b1bd46e7-0703-49b5-81f2-516568284547-logs\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.937098 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-horizon-secret-key\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.937120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-combined-ca-bundle\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.937495 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1bd46e7-0703-49b5-81f2-516568284547-logs\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.937924 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1bd46e7-0703-49b5-81f2-516568284547-scripts\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.938201 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b1bd46e7-0703-49b5-81f2-516568284547-config-data\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " 
pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.950975 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-horizon-secret-key\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.955039 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9647g\" (UniqueName: \"kubernetes.io/projected/b1bd46e7-0703-49b5-81f2-516568284547-kube-api-access-9647g\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.968269 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-combined-ca-bundle\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:21:59 crc kubenswrapper[4585]: I0215 17:21:59.972177 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1bd46e7-0703-49b5-81f2-516568284547-horizon-tls-certs\") pod \"horizon-5fb7dd448-vc5x5\" (UID: \"b1bd46e7-0703-49b5-81f2-516568284547\") " pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:22:00 crc kubenswrapper[4585]: I0215 17:22:00.176526 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:22:00 crc kubenswrapper[4585]: I0215 17:22:00.855210 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092684ee-0069-4a1c-aeeb-7d7aa694d6c1" path="/var/lib/kubelet/pods/092684ee-0069-4a1c-aeeb-7d7aa694d6c1/volumes" Feb 15 17:22:01 crc kubenswrapper[4585]: I0215 17:22:01.451480 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:22:01 crc kubenswrapper[4585]: I0215 17:22:01.551527 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-jz6b2"] Feb 15 17:22:01 crc kubenswrapper[4585]: I0215 17:22:01.554226 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="dnsmasq-dns" containerID="cri-o://91d06bbff1d6ef96ab116258859bb62b9d5af3d2dc475f9da4beb0f557e51a91" gracePeriod=10 Feb 15 17:22:01 crc kubenswrapper[4585]: I0215 17:22:01.853822 4585 generic.go:334] "Generic (PLEG): container finished" podID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerID="91d06bbff1d6ef96ab116258859bb62b9d5af3d2dc475f9da4beb0f557e51a91" exitCode=0 Feb 15 17:22:01 crc kubenswrapper[4585]: I0215 17:22:01.854065 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" event={"ID":"aee7bb0b-d27a-45c0-b514-a29314db0609","Type":"ContainerDied","Data":"91d06bbff1d6ef96ab116258859bb62b9d5af3d2dc475f9da4beb0f557e51a91"} Feb 15 17:22:03 crc kubenswrapper[4585]: I0215 17:22:03.419537 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: connect: connection refused" Feb 15 17:22:13 crc kubenswrapper[4585]: E0215 17:22:13.067177 4585 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Feb 15 17:22:13 crc kubenswrapper[4585]: E0215 17:22:13.067855 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lbg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Priv
ileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-ffnrs_openstack(c940d6f6-235b-4817-b022-b5d783c98a5b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:22:13 crc kubenswrapper[4585]: E0215 17:22:13.068949 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-ffnrs" podUID="c940d6f6-235b-4817-b022-b5d783c98a5b" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.199203 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332421 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-config-data\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332487 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-combined-ca-bundle\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332573 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-logs\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332627 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-public-tls-certs\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332658 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/dde51957-2da9-4733-bdaf-130554710119-kube-api-access-hszpv\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332688 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332737 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-httpd-run\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.332781 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-scripts\") pod \"dde51957-2da9-4733-bdaf-130554710119\" (UID: \"dde51957-2da9-4733-bdaf-130554710119\") " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.339218 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-logs" (OuterVolumeSpecName: "logs") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.339739 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.344034 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde51957-2da9-4733-bdaf-130554710119-kube-api-access-hszpv" (OuterVolumeSpecName: "kube-api-access-hszpv") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "kube-api-access-hszpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.349115 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.375908 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-scripts" (OuterVolumeSpecName: "scripts") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.406364 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.419715 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: i/o timeout" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.434675 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.434766 4585 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.434828 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hszpv\" (UniqueName: \"kubernetes.io/projected/dde51957-2da9-4733-bdaf-130554710119-kube-api-access-hszpv\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.434899 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.437668 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dde51957-2da9-4733-bdaf-130554710119-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.437749 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.455206 4585 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.463959 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.500659 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-config-data" (OuterVolumeSpecName: "config-data") pod "dde51957-2da9-4733-bdaf-130554710119" (UID: "dde51957-2da9-4733-bdaf-130554710119"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.539993 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.540016 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde51957-2da9-4733-bdaf-130554710119-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.540029 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:13 crc kubenswrapper[4585]: I0215 17:22:13.999332 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.001499 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dde51957-2da9-4733-bdaf-130554710119","Type":"ContainerDied","Data":"16f62995870f5a13f13acbe049ce5b58a2545c5aaf67c26ce1e91a7c371c36c9"} Feb 15 17:22:14 crc kubenswrapper[4585]: E0215 17:22:14.008389 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-ffnrs" podUID="c940d6f6-235b-4817-b022-b5d783c98a5b" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.044187 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.059445 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.102641 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:22:14 crc kubenswrapper[4585]: E0215 17:22:14.103086 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-log" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.103097 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-log" Feb 15 17:22:14 crc kubenswrapper[4585]: E0215 17:22:14.103120 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-httpd" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.103126 4585 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-httpd" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.103320 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-log" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.103333 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde51957-2da9-4733-bdaf-130554710119" containerName="glance-httpd" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.104334 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.109696 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.110321 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.124301 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.253780 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.253849 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc 
kubenswrapper[4585]: I0215 17:22:14.254045 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-logs\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.254126 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.254207 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.254298 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.254382 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 
crc kubenswrapper[4585]: I0215 17:22:14.254460 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnz6d\" (UniqueName: \"kubernetes.io/projected/0b434dc6-96c7-4fc0-ba05-a37d48709a08-kube-api-access-xnz6d\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356570 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356653 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356701 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-logs\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356728 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356753 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356785 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356804 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.356828 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnz6d\" (UniqueName: \"kubernetes.io/projected/0b434dc6-96c7-4fc0-ba05-a37d48709a08-kube-api-access-xnz6d\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.357527 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.358503 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.363024 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-logs\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.366919 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.374156 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-scripts\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.377242 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.378239 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-config-data\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.381012 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnz6d\" (UniqueName: \"kubernetes.io/projected/0b434dc6-96c7-4fc0-ba05-a37d48709a08-kube-api-access-xnz6d\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.394771 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.443461 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:22:14 crc kubenswrapper[4585]: I0215 17:22:14.858527 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde51957-2da9-4733-bdaf-130554710119" path="/var/lib/kubelet/pods/dde51957-2da9-4733-bdaf-130554710119/volumes" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.015937 4585 generic.go:334] "Generic (PLEG): container finished" podID="0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" containerID="5bf1fcaa90b05a0c96ca326dbe9bccfacc8b5baf8af9a1934e391e9e7d721635" exitCode=0 Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.015978 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4nzj" event={"ID":"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208","Type":"ContainerDied","Data":"5bf1fcaa90b05a0c96ca326dbe9bccfacc8b5baf8af9a1934e391e9e7d721635"} Feb 15 17:22:16 crc kubenswrapper[4585]: E0215 17:22:16.465626 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 15 17:22:16 crc kubenswrapper[4585]: E0215 17:22:16.466112 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68h65bh8dh7h679h65ch558h5ch585hbhc4h5b9h68dh549h75h96h646h8bh5b4h79hb8hc6h565h64fh566h5dhc5h558h575h68ch94h67dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bp4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6948db496c-b4lj8_openstack(5d3f0b27-f6d2-44ef-8e81-4052749cd681): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:22:16 crc kubenswrapper[4585]: E0215 
17:22:16.479983 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6948db496c-b4lj8" podUID="5d3f0b27-f6d2-44ef-8e81-4052749cd681" Feb 15 17:22:16 crc kubenswrapper[4585]: E0215 17:22:16.480372 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 15 17:22:16 crc kubenswrapper[4585]: E0215 17:22:16.480546 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n589h645h59bh55dhf5h689h66bh594h554h9bhd8h5d5hcch5d9h5dfhfdh59ch65ch7fh79hbhffh56fh95h66fhc6h68h5f6h56dhbch67bh5cfq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stb64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-588bd4c977-78cs7_openstack(dd6bd2c7-299c-40f4-ab82-240091e39764): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:22:16 crc kubenswrapper[4585]: E0215 
17:22:16.483754 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-588bd4c977-78cs7" podUID="dd6bd2c7-299c-40f4-ab82-240091e39764" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.578294 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.585876 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.728062 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-config-data\") pod \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.728133 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c4db\" (UniqueName: \"kubernetes.io/projected/aee7bb0b-d27a-45c0-b514-a29314db0609-kube-api-access-8c4db\") pod \"aee7bb0b-d27a-45c0-b514-a29314db0609\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.728185 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-svc\") pod \"aee7bb0b-d27a-45c0-b514-a29314db0609\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.728210 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-sb\") pod \"aee7bb0b-d27a-45c0-b514-a29314db0609\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.728250 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-scripts\") pod \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.728893 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-nb\") pod \"aee7bb0b-d27a-45c0-b514-a29314db0609\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.728976 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-combined-ca-bundle\") pod \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.729023 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-credential-keys\") pod \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.729080 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-fernet-keys\") pod \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\" (UID: 
\"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.729127 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-config\") pod \"aee7bb0b-d27a-45c0-b514-a29314db0609\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.729171 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-swift-storage-0\") pod \"aee7bb0b-d27a-45c0-b514-a29314db0609\" (UID: \"aee7bb0b-d27a-45c0-b514-a29314db0609\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.729244 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdjsf\" (UniqueName: \"kubernetes.io/projected/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-kube-api-access-cdjsf\") pod \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\" (UID: \"74e6cd3f-191f-4820-bf85-e431e1dbe0b3\") " Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.735466 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee7bb0b-d27a-45c0-b514-a29314db0609-kube-api-access-8c4db" (OuterVolumeSpecName: "kube-api-access-8c4db") pod "aee7bb0b-d27a-45c0-b514-a29314db0609" (UID: "aee7bb0b-d27a-45c0-b514-a29314db0609"). InnerVolumeSpecName "kube-api-access-8c4db". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.737158 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-kube-api-access-cdjsf" (OuterVolumeSpecName: "kube-api-access-cdjsf") pod "74e6cd3f-191f-4820-bf85-e431e1dbe0b3" (UID: "74e6cd3f-191f-4820-bf85-e431e1dbe0b3"). InnerVolumeSpecName "kube-api-access-cdjsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.738180 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-scripts" (OuterVolumeSpecName: "scripts") pod "74e6cd3f-191f-4820-bf85-e431e1dbe0b3" (UID: "74e6cd3f-191f-4820-bf85-e431e1dbe0b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.739350 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "74e6cd3f-191f-4820-bf85-e431e1dbe0b3" (UID: "74e6cd3f-191f-4820-bf85-e431e1dbe0b3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.745117 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "74e6cd3f-191f-4820-bf85-e431e1dbe0b3" (UID: "74e6cd3f-191f-4820-bf85-e431e1dbe0b3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.807347 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aee7bb0b-d27a-45c0-b514-a29314db0609" (UID: "aee7bb0b-d27a-45c0-b514-a29314db0609"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.808568 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74e6cd3f-191f-4820-bf85-e431e1dbe0b3" (UID: "74e6cd3f-191f-4820-bf85-e431e1dbe0b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.812018 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-config-data" (OuterVolumeSpecName: "config-data") pod "74e6cd3f-191f-4820-bf85-e431e1dbe0b3" (UID: "74e6cd3f-191f-4820-bf85-e431e1dbe0b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.827787 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aee7bb0b-d27a-45c0-b514-a29314db0609" (UID: "aee7bb0b-d27a-45c0-b514-a29314db0609"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835704 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdjsf\" (UniqueName: \"kubernetes.io/projected/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-kube-api-access-cdjsf\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835732 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835741 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c4db\" (UniqueName: \"kubernetes.io/projected/aee7bb0b-d27a-45c0-b514-a29314db0609-kube-api-access-8c4db\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835750 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835759 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835767 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835776 4585 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 
17:22:16.835784 4585 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/74e6cd3f-191f-4820-bf85-e431e1dbe0b3-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.835792 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.845699 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aee7bb0b-d27a-45c0-b514-a29314db0609" (UID: "aee7bb0b-d27a-45c0-b514-a29314db0609"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.857158 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-config" (OuterVolumeSpecName: "config") pod "aee7bb0b-d27a-45c0-b514-a29314db0609" (UID: "aee7bb0b-d27a-45c0-b514-a29314db0609"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.876767 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aee7bb0b-d27a-45c0-b514-a29314db0609" (UID: "aee7bb0b-d27a-45c0-b514-a29314db0609"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.937211 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.937238 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:16 crc kubenswrapper[4585]: I0215 17:22:16.937246 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aee7bb0b-d27a-45c0-b514-a29314db0609-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.025448 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" event={"ID":"aee7bb0b-d27a-45c0-b514-a29314db0609","Type":"ContainerDied","Data":"e4d8ce6630e05eb4103ab41921e425ef6c837ee2e9deff91eece58f4f473331d"} Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.025484 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.027976 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gp6zx" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.032270 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gp6zx" event={"ID":"74e6cd3f-191f-4820-bf85-e431e1dbe0b3","Type":"ContainerDied","Data":"037e91d03ec7c9c87275dc13931713b441eda8b73eb8d3a099740e80e2bf4de8"} Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.032508 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="037e91d03ec7c9c87275dc13931713b441eda8b73eb8d3a099740e80e2bf4de8" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.101466 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-jz6b2"] Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.114061 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-jz6b2"] Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.654813 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gp6zx"] Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.662764 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gp6zx"] Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.756968 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8dngq"] Feb 15 17:22:17 crc kubenswrapper[4585]: E0215 17:22:17.757341 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e6cd3f-191f-4820-bf85-e431e1dbe0b3" containerName="keystone-bootstrap" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.757356 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e6cd3f-191f-4820-bf85-e431e1dbe0b3" containerName="keystone-bootstrap" Feb 15 17:22:17 crc kubenswrapper[4585]: E0215 17:22:17.757374 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="init" Feb 
15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.757381 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="init" Feb 15 17:22:17 crc kubenswrapper[4585]: E0215 17:22:17.757410 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="dnsmasq-dns" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.757416 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="dnsmasq-dns" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.757633 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e6cd3f-191f-4820-bf85-e431e1dbe0b3" containerName="keystone-bootstrap" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.757661 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="dnsmasq-dns" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.758200 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.761410 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.761873 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.764020 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6w8mf" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.764187 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.764651 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.774319 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8dngq"] Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.860909 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-combined-ca-bundle\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.861233 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-scripts\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.861328 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-fernet-keys\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.861420 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bpj7\" (UniqueName: \"kubernetes.io/projected/4375904d-94fb-4c2f-804c-5451f7a71c6d-kube-api-access-7bpj7\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.861532 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-credential-keys\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.861681 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-config-data\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.963834 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bpj7\" (UniqueName: \"kubernetes.io/projected/4375904d-94fb-4c2f-804c-5451f7a71c6d-kube-api-access-7bpj7\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.964141 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-credential-keys\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.964244 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-config-data\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.964625 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-combined-ca-bundle\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.964838 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-scripts\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.964925 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-fernet-keys\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.969720 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-fernet-keys\") pod \"keystone-bootstrap-8dngq\" (UID: 
\"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.969789 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-credential-keys\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.970670 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-scripts\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.971947 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-config-data\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.971946 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-combined-ca-bundle\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:17 crc kubenswrapper[4585]: I0215 17:22:17.986082 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bpj7\" (UniqueName: \"kubernetes.io/projected/4375904d-94fb-4c2f-804c-5451f7a71c6d-kube-api-access-7bpj7\") pod \"keystone-bootstrap-8dngq\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:18 crc kubenswrapper[4585]: I0215 
17:22:18.127282 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:18 crc kubenswrapper[4585]: I0215 17:22:18.420810 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-jz6b2" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.162:5353: i/o timeout" Feb 15 17:22:18 crc kubenswrapper[4585]: I0215 17:22:18.855327 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e6cd3f-191f-4820-bf85-e431e1dbe0b3" path="/var/lib/kubelet/pods/74e6cd3f-191f-4820-bf85-e431e1dbe0b3/volumes" Feb 15 17:22:18 crc kubenswrapper[4585]: I0215 17:22:18.856013 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee7bb0b-d27a-45c0-b514-a29314db0609" path="/var/lib/kubelet/pods/aee7bb0b-d27a-45c0-b514-a29314db0609/volumes" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.063808 4585 scope.go:117] "RemoveContainer" containerID="51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9" Feb 15 17:22:25 crc kubenswrapper[4585]: E0215 17:22:25.618370 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 15 17:22:25 crc kubenswrapper[4585]: E0215 17:22:25.618504 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kf6r6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-5zw55_openstack(3701151b-dc31-421f-a1e1-9d694e13bc86): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:22:25 crc kubenswrapper[4585]: E0215 17:22:25.619669 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-5zw55" 
podUID="3701151b-dc31-421f-a1e1-9d694e13bc86" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.731171 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.739215 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.792880 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.828533 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc4gq\" (UniqueName: \"kubernetes.io/projected/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-kube-api-access-jc4gq\") pod \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.828615 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-combined-ca-bundle\") pod \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.828738 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3f0b27-f6d2-44ef-8e81-4052749cd681-horizon-secret-key\") pod \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.828805 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-config-data\") pod \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\" (UID: 
\"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.828861 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bp4j\" (UniqueName: \"kubernetes.io/projected/5d3f0b27-f6d2-44ef-8e81-4052749cd681-kube-api-access-4bp4j\") pod \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.828938 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-scripts\") pod \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.829022 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-config\") pod \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\" (UID: \"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.829070 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3f0b27-f6d2-44ef-8e81-4052749cd681-logs\") pod \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\" (UID: \"5d3f0b27-f6d2-44ef-8e81-4052749cd681\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.829437 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-config-data" (OuterVolumeSpecName: "config-data") pod "5d3f0b27-f6d2-44ef-8e81-4052749cd681" (UID: "5d3f0b27-f6d2-44ef-8e81-4052749cd681"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.829537 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-scripts" (OuterVolumeSpecName: "scripts") pod "5d3f0b27-f6d2-44ef-8e81-4052749cd681" (UID: "5d3f0b27-f6d2-44ef-8e81-4052749cd681"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.829761 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.829775 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d3f0b27-f6d2-44ef-8e81-4052749cd681-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.830106 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3f0b27-f6d2-44ef-8e81-4052749cd681-logs" (OuterVolumeSpecName: "logs") pod "5d3f0b27-f6d2-44ef-8e81-4052749cd681" (UID: "5d3f0b27-f6d2-44ef-8e81-4052749cd681"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.846312 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3f0b27-f6d2-44ef-8e81-4052749cd681-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5d3f0b27-f6d2-44ef-8e81-4052749cd681" (UID: "5d3f0b27-f6d2-44ef-8e81-4052749cd681"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.848160 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3f0b27-f6d2-44ef-8e81-4052749cd681-kube-api-access-4bp4j" (OuterVolumeSpecName: "kube-api-access-4bp4j") pod "5d3f0b27-f6d2-44ef-8e81-4052749cd681" (UID: "5d3f0b27-f6d2-44ef-8e81-4052749cd681"). InnerVolumeSpecName "kube-api-access-4bp4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.849749 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-kube-api-access-jc4gq" (OuterVolumeSpecName: "kube-api-access-jc4gq") pod "0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" (UID: "0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208"). InnerVolumeSpecName "kube-api-access-jc4gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.871772 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" (UID: "0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.881769 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-config" (OuterVolumeSpecName: "config") pod "0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" (UID: "0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.930418 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd6bd2c7-299c-40f4-ab82-240091e39764-horizon-secret-key\") pod \"dd6bd2c7-299c-40f4-ab82-240091e39764\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.930554 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-config-data\") pod \"dd6bd2c7-299c-40f4-ab82-240091e39764\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.930585 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stb64\" (UniqueName: \"kubernetes.io/projected/dd6bd2c7-299c-40f4-ab82-240091e39764-kube-api-access-stb64\") pod \"dd6bd2c7-299c-40f4-ab82-240091e39764\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.930630 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd6bd2c7-299c-40f4-ab82-240091e39764-logs\") pod \"dd6bd2c7-299c-40f4-ab82-240091e39764\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.930666 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-scripts\") pod \"dd6bd2c7-299c-40f4-ab82-240091e39764\" (UID: \"dd6bd2c7-299c-40f4-ab82-240091e39764\") " Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931228 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dd6bd2c7-299c-40f4-ab82-240091e39764-logs" (OuterVolumeSpecName: "logs") pod "dd6bd2c7-299c-40f4-ab82-240091e39764" (UID: "dd6bd2c7-299c-40f4-ab82-240091e39764"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931304 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931320 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d3f0b27-f6d2-44ef-8e81-4052749cd681-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931331 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc4gq\" (UniqueName: \"kubernetes.io/projected/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-kube-api-access-jc4gq\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931339 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931348 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5d3f0b27-f6d2-44ef-8e81-4052749cd681-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931356 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd6bd2c7-299c-40f4-ab82-240091e39764-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931364 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bp4j\" 
(UniqueName: \"kubernetes.io/projected/5d3f0b27-f6d2-44ef-8e81-4052749cd681-kube-api-access-4bp4j\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.931895 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-scripts" (OuterVolumeSpecName: "scripts") pod "dd6bd2c7-299c-40f4-ab82-240091e39764" (UID: "dd6bd2c7-299c-40f4-ab82-240091e39764"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.932334 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-config-data" (OuterVolumeSpecName: "config-data") pod "dd6bd2c7-299c-40f4-ab82-240091e39764" (UID: "dd6bd2c7-299c-40f4-ab82-240091e39764"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.934826 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6bd2c7-299c-40f4-ab82-240091e39764-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dd6bd2c7-299c-40f4-ab82-240091e39764" (UID: "dd6bd2c7-299c-40f4-ab82-240091e39764"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:25 crc kubenswrapper[4585]: I0215 17:22:25.934915 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6bd2c7-299c-40f4-ab82-240091e39764-kube-api-access-stb64" (OuterVolumeSpecName: "kube-api-access-stb64") pod "dd6bd2c7-299c-40f4-ab82-240091e39764" (UID: "dd6bd2c7-299c-40f4-ab82-240091e39764"). InnerVolumeSpecName "kube-api-access-stb64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.033158 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.033187 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stb64\" (UniqueName: \"kubernetes.io/projected/dd6bd2c7-299c-40f4-ab82-240091e39764-kube-api-access-stb64\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.033196 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd6bd2c7-299c-40f4-ab82-240091e39764-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.033207 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd6bd2c7-299c-40f4-ab82-240091e39764-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.178872 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4nzj" event={"ID":"0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208","Type":"ContainerDied","Data":"831cdebe7944a24c7febf6a674826d5465f3052af2e8657490903a81ffc2da24"} Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.178918 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="831cdebe7944a24c7febf6a674826d5465f3052af2e8657490903a81ffc2da24" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.178997 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-n4nzj" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.183532 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588bd4c977-78cs7" event={"ID":"dd6bd2c7-299c-40f4-ab82-240091e39764","Type":"ContainerDied","Data":"44440ddd640984dc07c6fc034a078c51cf8ba2910842d3ef19f2bac92a813444"} Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.183680 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-588bd4c977-78cs7" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.189373 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6948db496c-b4lj8" event={"ID":"5d3f0b27-f6d2-44ef-8e81-4052749cd681","Type":"ContainerDied","Data":"cf42ccc6d249e929cd304794829a796208dc14e8f06ced95bc170ed3415d224b"} Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.189737 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6948db496c-b4lj8" Feb 15 17:22:26 crc kubenswrapper[4585]: E0215 17:22:26.191444 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-5zw55" podUID="3701151b-dc31-421f-a1e1-9d694e13bc86" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.307560 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6948db496c-b4lj8"] Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.324719 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6948db496c-b4lj8"] Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.344707 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-588bd4c977-78cs7"] Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.355377 4585 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-588bd4c977-78cs7"] Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.858464 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3f0b27-f6d2-44ef-8e81-4052749cd681" path="/var/lib/kubelet/pods/5d3f0b27-f6d2-44ef-8e81-4052749cd681/volumes" Feb 15 17:22:26 crc kubenswrapper[4585]: I0215 17:22:26.859562 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6bd2c7-299c-40f4-ab82-240091e39764" path="/var/lib/kubelet/pods/dd6bd2c7-299c-40f4-ab82-240091e39764/volumes" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.032950 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-wc2pz"] Feb 15 17:22:27 crc kubenswrapper[4585]: E0215 17:22:27.033661 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" containerName="neutron-db-sync" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.033678 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" containerName="neutron-db-sync" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.033877 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" containerName="neutron-db-sync" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.034890 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.066975 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-wc2pz"] Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.167985 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.168834 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.169184 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.169251 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-svc\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.169333 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-config\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.169374 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2zvc\" (UniqueName: \"kubernetes.io/projected/6c4de367-d702-449a-b5a5-3db1c142a219-kube-api-access-v2zvc\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.180249 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-657d9d46dd-264lh"] Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.182194 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.184748 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.184779 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.184981 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.185256 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mx4pk" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.210727 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-657d9d46dd-264lh"] Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.271108 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9zmw\" (UniqueName: \"kubernetes.io/projected/83f72fb7-0fae-45bd-894b-0b8235e489eb-kube-api-access-w9zmw\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.271438 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.271538 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.271639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-ovndb-tls-certs\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.271740 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-svc\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.271857 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-combined-ca-bundle\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.271965 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-config\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.272080 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-httpd-config\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.272211 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-config\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.272310 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.272423 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.272530 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-svc\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.272913 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-config\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.272977 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zvc\" (UniqueName: \"kubernetes.io/projected/6c4de367-d702-449a-b5a5-3db1c142a219-kube-api-access-v2zvc\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.273403 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.274036 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.291004 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2zvc\" (UniqueName: \"kubernetes.io/projected/6c4de367-d702-449a-b5a5-3db1c142a219-kube-api-access-v2zvc\") pod \"dnsmasq-dns-6b7b667979-wc2pz\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.356378 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.375151 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-config\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.375198 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-httpd-config\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.375333 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9zmw\" (UniqueName: \"kubernetes.io/projected/83f72fb7-0fae-45bd-894b-0b8235e489eb-kube-api-access-w9zmw\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.375371 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-ovndb-tls-certs\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.375397 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-combined-ca-bundle\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.378818 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-combined-ca-bundle\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.384194 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-httpd-config\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.384754 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-ovndb-tls-certs\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.394339 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-config\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.397235 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9zmw\" (UniqueName: \"kubernetes.io/projected/83f72fb7-0fae-45bd-894b-0b8235e489eb-kube-api-access-w9zmw\") pod \"neutron-657d9d46dd-264lh\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.506611 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:27 crc kubenswrapper[4585]: E0215 17:22:27.702028 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 15 17:22:27 crc kubenswrapper[4585]: E0215 17:22:27.702573 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k45br,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-4vdfm_openstack(ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:22:27 crc kubenswrapper[4585]: E0215 17:22:27.705727 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-4vdfm" podUID="ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.739688 4585 scope.go:117] "RemoveContainer" containerID="8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762" Feb 15 17:22:27 crc kubenswrapper[4585]: E0215 17:22:27.753314 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762\": container with ID starting with 8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762 not found: ID does not exist" containerID="8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.753362 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762"} err="failed to get container status \"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762\": rpc error: code = NotFound desc = could not find container \"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762\": container with ID starting with 8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762 not found: ID does not exist" Feb 15 17:22:27 crc 
kubenswrapper[4585]: I0215 17:22:27.753387 4585 scope.go:117] "RemoveContainer" containerID="51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9" Feb 15 17:22:27 crc kubenswrapper[4585]: E0215 17:22:27.762211 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9\": container with ID starting with 51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9 not found: ID does not exist" containerID="51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.762255 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9"} err="failed to get container status \"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9\": rpc error: code = NotFound desc = could not find container \"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9\": container with ID starting with 51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9 not found: ID does not exist" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.762277 4585 scope.go:117] "RemoveContainer" containerID="8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.763937 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762"} err="failed to get container status \"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762\": rpc error: code = NotFound desc = could not find container \"8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762\": container with ID starting with 8b883dfe8f93ab59c1d1d2c268d028ddf03c82ad58cb6ff73b2a898e5ade2762 not found: ID does not exist" Feb 15 
17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.763985 4585 scope.go:117] "RemoveContainer" containerID="51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.779859 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9"} err="failed to get container status \"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9\": rpc error: code = NotFound desc = could not find container \"51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9\": container with ID starting with 51cf985c9fb0485480bcf8049082bd9bd8b2b601b61055ae951b5c02df8134d9 not found: ID does not exist" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.779902 4585 scope.go:117] "RemoveContainer" containerID="c1dc634a11779869acaf6c3afb5529d072c59d4c94aaeb0495d0716b31192a2f" Feb 15 17:22:27 crc kubenswrapper[4585]: I0215 17:22:27.948704 4585 scope.go:117] "RemoveContainer" containerID="586f194247acb2707459549d0601ca08973f40620fece336c97c506e76cfae71" Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.023669 4585 scope.go:117] "RemoveContainer" containerID="91d06bbff1d6ef96ab116258859bb62b9d5af3d2dc475f9da4beb0f557e51a91" Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.080928 4585 scope.go:117] "RemoveContainer" containerID="17c206d14c9c71cd653f5060958aad40251cddde0bd091ded7f4f4ff8c3fc7fe" Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.260909 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b9f5444b-8n6qh"] Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.270284 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerStarted","Data":"a7500de9e2f80a57638642b20ce7bb90902d5ee491b2f0946a6106887712c69d"} Feb 15 17:22:28 crc kubenswrapper[4585]: E0215 
17:22:28.272364 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-4vdfm" podUID="ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.422573 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fb7dd448-vc5x5"] Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.528263 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-wc2pz"] Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.557733 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8dngq"] Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.655694 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:22:28 crc kubenswrapper[4585]: I0215 17:22:28.795750 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:22:28 crc kubenswrapper[4585]: W0215 17:22:28.808270 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b434dc6_96c7_4fc0_ba05_a37d48709a08.slice/crio-a59ac35c6f75e94cc8c874ae2642e1c3a764c0e1e7c7a6f433f75d2e827f3c9f WatchSource:0}: Error finding container a59ac35c6f75e94cc8c874ae2642e1c3a764c0e1e7c7a6f433f75d2e827f3c9f: Status 404 returned error can't find the container with id a59ac35c6f75e94cc8c874ae2642e1c3a764c0e1e7c7a6f433f75d2e827f3c9f Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.279392 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c98d54745-hg9jf" 
event={"ID":"0ea48db7-287e-47f9-9a08-5b2f153fa269","Type":"ContainerStarted","Data":"837e67512bff9956662e845ee1325e2cfbdafd3a1bdf16b2ccd38299a5676737"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.291093 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffnrs" event={"ID":"c940d6f6-235b-4817-b022-b5d783c98a5b","Type":"ContainerStarted","Data":"029d2b95e658de5b81a2f9dac9fca4cf2aa230894079bc8f0093bfc898202f85"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.295169 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8dngq" event={"ID":"4375904d-94fb-4c2f-804c-5451f7a71c6d","Type":"ContainerStarted","Data":"4b7b82c7cabafbb1eb484dbe13a73bfb2d2e3c4c08e9918033f57f5ac7c93be7"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.295194 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8dngq" event={"ID":"4375904d-94fb-4c2f-804c-5451f7a71c6d","Type":"ContainerStarted","Data":"4c49e4911894f44c9bebd9baf7d135f40fb0ef48e688d44f77c150895dd42188"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.298720 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b434dc6-96c7-4fc0-ba05-a37d48709a08","Type":"ContainerStarted","Data":"a59ac35c6f75e94cc8c874ae2642e1c3a764c0e1e7c7a6f433f75d2e827f3c9f"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.300523 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42b27229-f2f4-46cc-9070-f9e65aacdc0c","Type":"ContainerStarted","Data":"30457eca0f461f186d0d031b5610c4d6639c2aba753e09290ba6b43e25c84c78"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.303018 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" 
event={"ID":"6c4de367-d702-449a-b5a5-3db1c142a219","Type":"ContainerStarted","Data":"70a10254114eb266dd2c713bc8fa9731240c6c98a6a5227b9459673378692eae"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.307413 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ffnrs" podStartSLOduration=4.09307917 podStartE2EDuration="39.307402059s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="2026-02-15 17:21:52.776296813 +0000 UTC m=+968.719704945" lastFinishedPulling="2026-02-15 17:22:27.990619702 +0000 UTC m=+1003.934027834" observedRunningTime="2026-02-15 17:22:29.304953622 +0000 UTC m=+1005.248361754" watchObservedRunningTime="2026-02-15 17:22:29.307402059 +0000 UTC m=+1005.250810191" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.315343 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerStarted","Data":"67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.315380 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerStarted","Data":"1e231b6e9af7cbba5a9cd65de87dc8c740c6f93e341ba7c568cf49ae06364479"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.321462 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb7dd448-vc5x5" event={"ID":"b1bd46e7-0703-49b5-81f2-516568284547","Type":"ContainerStarted","Data":"95aaf9191494923bbff9b7269725daf03c1be0b712b7337799a7ee8656efb40f"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.321490 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb7dd448-vc5x5" 
event={"ID":"b1bd46e7-0703-49b5-81f2-516568284547","Type":"ContainerStarted","Data":"b7336905a2903494c9192777f0f8a4941e6540f50e987d20f811007ce46fc1d9"} Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.326340 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8dngq" podStartSLOduration=12.326330684 podStartE2EDuration="12.326330684s" podCreationTimestamp="2026-02-15 17:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:29.324647789 +0000 UTC m=+1005.268055921" watchObservedRunningTime="2026-02-15 17:22:29.326330684 +0000 UTC m=+1005.269738816" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.355649 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b9f5444b-8n6qh" podStartSLOduration=30.355624633 podStartE2EDuration="30.355624633s" podCreationTimestamp="2026-02-15 17:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:29.343092921 +0000 UTC m=+1005.286501053" watchObservedRunningTime="2026-02-15 17:22:29.355624633 +0000 UTC m=+1005.299032775" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.574323 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b7cd54d97-kklm5"] Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.583253 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.585965 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.586218 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.590864 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b7cd54d97-kklm5"] Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.628782 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-combined-ca-bundle\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.628894 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-ovndb-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.628968 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vncz\" (UniqueName: \"kubernetes.io/projected/a588849b-011b-4d05-90d1-e5a41644a556-kube-api-access-6vncz\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.629012 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-public-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.629036 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-httpd-config\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.629135 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-internal-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.629185 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-config\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.731001 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-internal-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.731888 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-config\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.731930 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-combined-ca-bundle\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.731995 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-ovndb-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.732025 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vncz\" (UniqueName: \"kubernetes.io/projected/a588849b-011b-4d05-90d1-e5a41644a556-kube-api-access-6vncz\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.732069 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-public-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.732104 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-httpd-config\") pod 
\"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.738862 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-internal-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.741913 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-httpd-config\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.743154 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-config\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.752138 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-combined-ca-bundle\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.752712 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-public-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc 
kubenswrapper[4585]: I0215 17:22:29.753210 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-ovndb-tls-certs\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.810317 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vncz\" (UniqueName: \"kubernetes.io/projected/a588849b-011b-4d05-90d1-e5a41644a556-kube-api-access-6vncz\") pod \"neutron-7b7cd54d97-kklm5\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.880448 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-657d9d46dd-264lh"] Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.921319 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.931748 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:22:29 crc kubenswrapper[4585]: I0215 17:22:29.931789 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.331002 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b434dc6-96c7-4fc0-ba05-a37d48709a08","Type":"ContainerStarted","Data":"26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c"} Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.333280 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42b27229-f2f4-46cc-9070-f9e65aacdc0c","Type":"ContainerStarted","Data":"56c04e407c8a9b02a646bf822673cfee5e2dbe2c4ef4d80e4af70bf96d6ee920"} Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.336165 4585 generic.go:334] "Generic (PLEG): container finished" podID="6c4de367-d702-449a-b5a5-3db1c142a219" containerID="2097458816da5299cc074e42360d0c92c543719a2316c4ad53f818567ea9f40d" exitCode=0 Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.336224 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" event={"ID":"6c4de367-d702-449a-b5a5-3db1c142a219","Type":"ContainerDied","Data":"2097458816da5299cc074e42360d0c92c543719a2316c4ad53f818567ea9f40d"} Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.338871 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerStarted","Data":"e6a81c4c256ad7683acf59828ce484c3e06e042226f140242575aec7b2779784"} Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.351612 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb7dd448-vc5x5" event={"ID":"b1bd46e7-0703-49b5-81f2-516568284547","Type":"ContainerStarted","Data":"03cde210d7e2baae60bc76453feecb1542812009c51804c394167be427f59f34"} Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.398539 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c98d54745-hg9jf" event={"ID":"0ea48db7-287e-47f9-9a08-5b2f153fa269","Type":"ContainerStarted","Data":"a7f96a78f761b3e91def9fd79490ece76d3ce60b3832708e70d799fbf4333e2b"} Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.398821 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c98d54745-hg9jf" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon-log" containerID="cri-o://837e67512bff9956662e845ee1325e2cfbdafd3a1bdf16b2ccd38299a5676737" gracePeriod=30 Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.398950 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c98d54745-hg9jf" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon" containerID="cri-o://a7f96a78f761b3e91def9fd79490ece76d3ce60b3832708e70d799fbf4333e2b" gracePeriod=30 Feb 15 17:22:30 crc kubenswrapper[4585]: I0215 17:22:30.470926 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c98d54745-hg9jf" podStartSLOduration=6.712650978 podStartE2EDuration="37.470910428s" podCreationTimestamp="2026-02-15 17:21:53 +0000 UTC" firstStartedPulling="2026-02-15 17:21:54.877138274 +0000 UTC m=+970.820546406" lastFinishedPulling="2026-02-15 17:22:25.635397724 +0000 UTC m=+1001.578805856" observedRunningTime="2026-02-15 17:22:30.466954091 +0000 UTC m=+1006.410362223" watchObservedRunningTime="2026-02-15 17:22:30.470910428 +0000 UTC m=+1006.414318560" Feb 15 17:22:31 crc kubenswrapper[4585]: I0215 17:22:31.355752 4585 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/horizon-5fb7dd448-vc5x5" podStartSLOduration=32.355737443 podStartE2EDuration="32.355737443s" podCreationTimestamp="2026-02-15 17:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:30.522074162 +0000 UTC m=+1006.465482294" watchObservedRunningTime="2026-02-15 17:22:31.355737443 +0000 UTC m=+1007.299145575" Feb 15 17:22:31 crc kubenswrapper[4585]: I0215 17:22:31.366368 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b7cd54d97-kklm5"] Feb 15 17:22:31 crc kubenswrapper[4585]: I0215 17:22:31.423731 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" event={"ID":"6c4de367-d702-449a-b5a5-3db1c142a219","Type":"ContainerStarted","Data":"2c8e51c2515d518daa5c9360b926fa443780f5165f4d0019088940159115be39"} Feb 15 17:22:31 crc kubenswrapper[4585]: I0215 17:22:31.424870 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:31 crc kubenswrapper[4585]: I0215 17:22:31.431291 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-657d9d46dd-264lh" event={"ID":"83f72fb7-0fae-45bd-894b-0b8235e489eb","Type":"ContainerStarted","Data":"b4152a59efd5841b1bfca434f1be40874851466c91d0256932fbdb62beef2fa1"} Feb 15 17:22:31 crc kubenswrapper[4585]: I0215 17:22:31.431314 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-657d9d46dd-264lh" event={"ID":"83f72fb7-0fae-45bd-894b-0b8235e489eb","Type":"ContainerStarted","Data":"d7b9e68df219dc66bd23c119a1e4a50731140a4b69becb60eb08176e300f5b0f"} Feb 15 17:22:31 crc kubenswrapper[4585]: I0215 17:22:31.446586 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" podStartSLOduration=4.446567558 podStartE2EDuration="4.446567558s" 
podCreationTimestamp="2026-02-15 17:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:31.440892763 +0000 UTC m=+1007.384300895" watchObservedRunningTime="2026-02-15 17:22:31.446567558 +0000 UTC m=+1007.389975690" Feb 15 17:22:32 crc kubenswrapper[4585]: I0215 17:22:32.443668 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerStarted","Data":"936ac29cae1c80c260b0921007b58089ffbc7ab4b0f219a8fb48f11beef758d1"} Feb 15 17:22:32 crc kubenswrapper[4585]: I0215 17:22:32.445493 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cd54d97-kklm5" event={"ID":"a588849b-011b-4d05-90d1-e5a41644a556","Type":"ContainerStarted","Data":"5c1fe7716feeaa7ccd4e30cc4571f877913e667942868c307831d83f5e001e0c"} Feb 15 17:22:33 crc kubenswrapper[4585]: I0215 17:22:33.837975 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c98d54745-hg9jf" Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.476372 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-657d9d46dd-264lh" event={"ID":"83f72fb7-0fae-45bd-894b-0b8235e489eb","Type":"ContainerStarted","Data":"ac218d79b8c9ccf64945c41c3f474ce122891000dc4674623580d8148f91dba3"} Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.476972 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.483852 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cd54d97-kklm5" event={"ID":"a588849b-011b-4d05-90d1-e5a41644a556","Type":"ContainerStarted","Data":"b093695f5f74e82abd78c320eedc1a1232383d4410243623b6ec957ca33df879"} Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.483890 4585 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-7b7cd54d97-kklm5" event={"ID":"a588849b-011b-4d05-90d1-e5a41644a556","Type":"ContainerStarted","Data":"bf9a3c887db9c49ddb20f772e5e627ff23b8d3443a8dd3275cf02ae67bd458ec"} Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.484522 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.538668 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b434dc6-96c7-4fc0-ba05-a37d48709a08","Type":"ContainerStarted","Data":"b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6"} Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.551271 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b7cd54d97-kklm5" podStartSLOduration=5.551252012 podStartE2EDuration="5.551252012s" podCreationTimestamp="2026-02-15 17:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:34.546568844 +0000 UTC m=+1010.489976976" watchObservedRunningTime="2026-02-15 17:22:34.551252012 +0000 UTC m=+1010.494660144" Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.559061 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42b27229-f2f4-46cc-9070-f9e65aacdc0c","Type":"ContainerStarted","Data":"615f51f5291ae2a17d517cde15934119511e4f725afd59cff25b8810d0890142"} Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.559212 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-log" containerID="cri-o://56c04e407c8a9b02a646bf822673cfee5e2dbe2c4ef4d80e4af70bf96d6ee920" gracePeriod=30 Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 
17:22:34.559536 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-httpd" containerID="cri-o://615f51f5291ae2a17d517cde15934119511e4f725afd59cff25b8810d0890142" gracePeriod=30 Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.563030 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-657d9d46dd-264lh" podStartSLOduration=7.563008292 podStartE2EDuration="7.563008292s" podCreationTimestamp="2026-02-15 17:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:34.505817973 +0000 UTC m=+1010.449226105" watchObservedRunningTime="2026-02-15 17:22:34.563008292 +0000 UTC m=+1010.506416424" Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.610578 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.610556358 podStartE2EDuration="20.610556358s" podCreationTimestamp="2026-02-15 17:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:34.591527079 +0000 UTC m=+1010.534935211" watchObservedRunningTime="2026-02-15 17:22:34.610556358 +0000 UTC m=+1010.553964490" Feb 15 17:22:34 crc kubenswrapper[4585]: I0215 17:22:34.633737 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=36.633718889 podStartE2EDuration="36.633718889s" podCreationTimestamp="2026-02-15 17:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:34.626016009 +0000 UTC m=+1010.569424141" watchObservedRunningTime="2026-02-15 17:22:34.633718889 
+0000 UTC m=+1010.577127021" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.589716 4585 generic.go:334] "Generic (PLEG): container finished" podID="c940d6f6-235b-4817-b022-b5d783c98a5b" containerID="029d2b95e658de5b81a2f9dac9fca4cf2aa230894079bc8f0093bfc898202f85" exitCode=0 Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.589984 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffnrs" event={"ID":"c940d6f6-235b-4817-b022-b5d783c98a5b","Type":"ContainerDied","Data":"029d2b95e658de5b81a2f9dac9fca4cf2aa230894079bc8f0093bfc898202f85"} Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.603092 4585 generic.go:334] "Generic (PLEG): container finished" podID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerID="615f51f5291ae2a17d517cde15934119511e4f725afd59cff25b8810d0890142" exitCode=0 Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.603119 4585 generic.go:334] "Generic (PLEG): container finished" podID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerID="56c04e407c8a9b02a646bf822673cfee5e2dbe2c4ef4d80e4af70bf96d6ee920" exitCode=143 Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.604620 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42b27229-f2f4-46cc-9070-f9e65aacdc0c","Type":"ContainerDied","Data":"615f51f5291ae2a17d517cde15934119511e4f725afd59cff25b8810d0890142"} Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.604681 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42b27229-f2f4-46cc-9070-f9e65aacdc0c","Type":"ContainerDied","Data":"56c04e407c8a9b02a646bf822673cfee5e2dbe2c4ef4d80e4af70bf96d6ee920"} Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.747630 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824321 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-combined-ca-bundle\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824394 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-httpd-run\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824436 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfrdc\" (UniqueName: \"kubernetes.io/projected/42b27229-f2f4-46cc-9070-f9e65aacdc0c-kube-api-access-qfrdc\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824501 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-scripts\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824570 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-logs\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824648 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824707 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-config-data\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.824739 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-internal-tls-certs\") pod \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\" (UID: \"42b27229-f2f4-46cc-9070-f9e65aacdc0c\") " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.826881 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.828218 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-logs" (OuterVolumeSpecName: "logs") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.836881 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b27229-f2f4-46cc-9070-f9e65aacdc0c-kube-api-access-qfrdc" (OuterVolumeSpecName: "kube-api-access-qfrdc") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "kube-api-access-qfrdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.846032 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.861918 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-scripts" (OuterVolumeSpecName: "scripts") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.910988 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.927347 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.928643 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.928759 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.928844 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfrdc\" (UniqueName: \"kubernetes.io/projected/42b27229-f2f4-46cc-9070-f9e65aacdc0c-kube-api-access-qfrdc\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.928929 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.929004 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b27229-f2f4-46cc-9070-f9e65aacdc0c-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.929171 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.929235 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.962248 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 15 17:22:35 crc kubenswrapper[4585]: I0215 17:22:35.978277 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-config-data" (OuterVolumeSpecName: "config-data") pod "42b27229-f2f4-46cc-9070-f9e65aacdc0c" (UID: "42b27229-f2f4-46cc-9070-f9e65aacdc0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.031072 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.031102 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b27229-f2f4-46cc-9070-f9e65aacdc0c-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.614335 4585 generic.go:334] "Generic (PLEG): container finished" podID="4375904d-94fb-4c2f-804c-5451f7a71c6d" containerID="4b7b82c7cabafbb1eb484dbe13a73bfb2d2e3c4c08e9918033f57f5ac7c93be7" exitCode=0 Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.614728 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8dngq" 
event={"ID":"4375904d-94fb-4c2f-804c-5451f7a71c6d","Type":"ContainerDied","Data":"4b7b82c7cabafbb1eb484dbe13a73bfb2d2e3c4c08e9918033f57f5ac7c93be7"} Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.619103 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.622799 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"42b27229-f2f4-46cc-9070-f9e65aacdc0c","Type":"ContainerDied","Data":"30457eca0f461f186d0d031b5610c4d6639c2aba753e09290ba6b43e25c84c78"} Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.622873 4585 scope.go:117] "RemoveContainer" containerID="615f51f5291ae2a17d517cde15934119511e4f725afd59cff25b8810d0890142" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.708942 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.719399 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.728652 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:22:36 crc kubenswrapper[4585]: E0215 17:22:36.729104 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-log" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.729115 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-log" Feb 15 17:22:36 crc kubenswrapper[4585]: E0215 17:22:36.729157 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-httpd" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.729164 4585 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-httpd" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.729402 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-log" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.729414 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" containerName="glance-httpd" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.730435 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.732894 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.733222 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.733463 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.924626 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b27229-f2f4-46cc-9070-f9e65aacdc0c" path="/var/lib/kubelet/pods/42b27229-f2f4-46cc-9070-f9e65aacdc0c/volumes" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.929629 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqj72\" (UniqueName: \"kubernetes.io/projected/c74b1e0f-71b3-4fe0-9153-4220719171aa-kube-api-access-fqj72\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.929668 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.929698 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.929728 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.929772 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.930044 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.931382 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:36 crc kubenswrapper[4585]: I0215 17:22:36.931443 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.032944 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.032996 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.033020 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.033089 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fqj72\" (UniqueName: \"kubernetes.io/projected/c74b1e0f-71b3-4fe0-9153-4220719171aa-kube-api-access-fqj72\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.033106 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.033123 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.033140 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.033164 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.034376 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.036456 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.048575 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-logs\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.055284 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqj72\" (UniqueName: \"kubernetes.io/projected/c74b1e0f-71b3-4fe0-9153-4220719171aa-kube-api-access-fqj72\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.056865 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.057936 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.064083 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.079509 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.118422 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.358800 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.373933 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.428584 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lxs78"] Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.429041 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerName="dnsmasq-dns" containerID="cri-o://beff0f4fdea5b332649031eb9ba34409331e0085aceb343594ce8e5376b2ba58" gracePeriod=10 Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.645936 4585 generic.go:334] "Generic (PLEG): container finished" podID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerID="beff0f4fdea5b332649031eb9ba34409331e0085aceb343594ce8e5376b2ba58" exitCode=0 Feb 15 17:22:37 crc kubenswrapper[4585]: I0215 17:22:37.645972 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" event={"ID":"f62ae2f9-29ff-4f97-951f-7ef86c863a7d","Type":"ContainerDied","Data":"beff0f4fdea5b332649031eb9ba34409331e0085aceb343594ce8e5376b2ba58"} Feb 15 17:22:39 crc kubenswrapper[4585]: I0215 17:22:39.604682 4585 scope.go:117] "RemoveContainer" containerID="56c04e407c8a9b02a646bf822673cfee5e2dbe2c4ef4d80e4af70bf96d6ee920" Feb 15 17:22:39 crc kubenswrapper[4585]: I0215 17:22:39.935107 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection refused" Feb 15 17:22:40 crc kubenswrapper[4585]: I0215 17:22:40.177855 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:22:40 crc kubenswrapper[4585]: I0215 17:22:40.177895 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:22:40 crc kubenswrapper[4585]: I0215 17:22:40.179533 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.164885 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.168078 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ffnrs" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.168385 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.368868 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-nb\") pod \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.368904 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vknjr\" (UniqueName: \"kubernetes.io/projected/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-kube-api-access-vknjr\") pod \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.368967 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-config-data\") pod \"4375904d-94fb-4c2f-804c-5451f7a71c6d\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369008 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-combined-ca-bundle\") pod \"c940d6f6-235b-4817-b022-b5d783c98a5b\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369048 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-scripts\") pod \"c940d6f6-235b-4817-b022-b5d783c98a5b\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369090 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-sb\") pod \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369108 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-svc\") pod \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369148 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-config-data\") pod \"c940d6f6-235b-4817-b022-b5d783c98a5b\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369170 4585 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-config\") pod \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369185 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-fernet-keys\") pod \"4375904d-94fb-4c2f-804c-5451f7a71c6d\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369233 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bpj7\" (UniqueName: \"kubernetes.io/projected/4375904d-94fb-4c2f-804c-5451f7a71c6d-kube-api-access-7bpj7\") pod \"4375904d-94fb-4c2f-804c-5451f7a71c6d\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369613 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-swift-storage-0\") pod \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\" (UID: \"f62ae2f9-29ff-4f97-951f-7ef86c863a7d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369683 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lbg2\" (UniqueName: \"kubernetes.io/projected/c940d6f6-235b-4817-b022-b5d783c98a5b-kube-api-access-8lbg2\") pod \"c940d6f6-235b-4817-b022-b5d783c98a5b\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369741 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-scripts\") pod \"4375904d-94fb-4c2f-804c-5451f7a71c6d\" (UID: 
\"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369782 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c940d6f6-235b-4817-b022-b5d783c98a5b-logs\") pod \"c940d6f6-235b-4817-b022-b5d783c98a5b\" (UID: \"c940d6f6-235b-4817-b022-b5d783c98a5b\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369808 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-combined-ca-bundle\") pod \"4375904d-94fb-4c2f-804c-5451f7a71c6d\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.369825 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-credential-keys\") pod \"4375904d-94fb-4c2f-804c-5451f7a71c6d\" (UID: \"4375904d-94fb-4c2f-804c-5451f7a71c6d\") " Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.386426 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c940d6f6-235b-4817-b022-b5d783c98a5b-logs" (OuterVolumeSpecName: "logs") pod "c940d6f6-235b-4817-b022-b5d783c98a5b" (UID: "c940d6f6-235b-4817-b022-b5d783c98a5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.412674 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-scripts" (OuterVolumeSpecName: "scripts") pod "c940d6f6-235b-4817-b022-b5d783c98a5b" (UID: "c940d6f6-235b-4817-b022-b5d783c98a5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.412719 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-kube-api-access-vknjr" (OuterVolumeSpecName: "kube-api-access-vknjr") pod "f62ae2f9-29ff-4f97-951f-7ef86c863a7d" (UID: "f62ae2f9-29ff-4f97-951f-7ef86c863a7d"). InnerVolumeSpecName "kube-api-access-vknjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.412789 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c940d6f6-235b-4817-b022-b5d783c98a5b-kube-api-access-8lbg2" (OuterVolumeSpecName: "kube-api-access-8lbg2") pod "c940d6f6-235b-4817-b022-b5d783c98a5b" (UID: "c940d6f6-235b-4817-b022-b5d783c98a5b"). InnerVolumeSpecName "kube-api-access-8lbg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.412835 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4375904d-94fb-4c2f-804c-5451f7a71c6d" (UID: "4375904d-94fb-4c2f-804c-5451f7a71c6d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.413253 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4375904d-94fb-4c2f-804c-5451f7a71c6d-kube-api-access-7bpj7" (OuterVolumeSpecName: "kube-api-access-7bpj7") pod "4375904d-94fb-4c2f-804c-5451f7a71c6d" (UID: "4375904d-94fb-4c2f-804c-5451f7a71c6d"). InnerVolumeSpecName "kube-api-access-7bpj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.434208 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-scripts" (OuterVolumeSpecName: "scripts") pod "4375904d-94fb-4c2f-804c-5451f7a71c6d" (UID: "4375904d-94fb-4c2f-804c-5451f7a71c6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.435370 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4375904d-94fb-4c2f-804c-5451f7a71c6d" (UID: "4375904d-94fb-4c2f-804c-5451f7a71c6d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.475531 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vknjr\" (UniqueName: \"kubernetes.io/projected/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-kube-api-access-vknjr\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.475555 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.475565 4585 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.475574 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bpj7\" (UniqueName: \"kubernetes.io/projected/4375904d-94fb-4c2f-804c-5451f7a71c6d-kube-api-access-7bpj7\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc 
kubenswrapper[4585]: I0215 17:22:42.475607 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lbg2\" (UniqueName: \"kubernetes.io/projected/c940d6f6-235b-4817-b022-b5d783c98a5b-kube-api-access-8lbg2\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.475780 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.475789 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c940d6f6-235b-4817-b022-b5d783c98a5b-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.475798 4585 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.677770 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4375904d-94fb-4c2f-804c-5451f7a71c6d" (UID: "4375904d-94fb-4c2f-804c-5451f7a71c6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.684812 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.692025 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c940d6f6-235b-4817-b022-b5d783c98a5b" (UID: "c940d6f6-235b-4817-b022-b5d783c98a5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.701605 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerStarted","Data":"933becb359a1a0ce857ab253117d920a4490893087b8993cfd3dae34e95f3490"} Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.704121 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffnrs" event={"ID":"c940d6f6-235b-4817-b022-b5d783c98a5b","Type":"ContainerDied","Data":"98b6382847d962adb8a3485b3a9943a5f3288556e8ae6992a5bb223b936f06df"} Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.704161 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98b6382847d962adb8a3485b3a9943a5f3288556e8ae6992a5bb223b936f06df" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.704215 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ffnrs" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.707365 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8dngq" event={"ID":"4375904d-94fb-4c2f-804c-5451f7a71c6d","Type":"ContainerDied","Data":"4c49e4911894f44c9bebd9baf7d135f40fb0ef48e688d44f77c150895dd42188"} Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.707389 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c49e4911894f44c9bebd9baf7d135f40fb0ef48e688d44f77c150895dd42188" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.707427 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8dngq" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.714879 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" event={"ID":"f62ae2f9-29ff-4f97-951f-7ef86c863a7d","Type":"ContainerDied","Data":"5e1b2c39432bddcb5a508f9d5c2428b8409e0b71e41659a8a5318daf9b159a1f"} Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.714909 4585 scope.go:117] "RemoveContainer" containerID="beff0f4fdea5b332649031eb9ba34409331e0085aceb343594ce8e5376b2ba58" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.714977 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.741971 4585 scope.go:117] "RemoveContainer" containerID="9967ef3aa0741ff7ceb243269fe1886601e702aa586c4bdf6c890a1a76fcf80f" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.768984 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-config-data" (OuterVolumeSpecName: "config-data") pod "4375904d-94fb-4c2f-804c-5451f7a71c6d" (UID: "4375904d-94fb-4c2f-804c-5451f7a71c6d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.786309 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4375904d-94fb-4c2f-804c-5451f7a71c6d-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.786335 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.836897 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.852202 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f62ae2f9-29ff-4f97-951f-7ef86c863a7d" (UID: "f62ae2f9-29ff-4f97-951f-7ef86c863a7d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.861910 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-config-data" (OuterVolumeSpecName: "config-data") pod "c940d6f6-235b-4817-b022-b5d783c98a5b" (UID: "c940d6f6-235b-4817-b022-b5d783c98a5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.862095 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f62ae2f9-29ff-4f97-951f-7ef86c863a7d" (UID: "f62ae2f9-29ff-4f97-951f-7ef86c863a7d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.873918 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f62ae2f9-29ff-4f97-951f-7ef86c863a7d" (UID: "f62ae2f9-29ff-4f97-951f-7ef86c863a7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.883716 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-config" (OuterVolumeSpecName: "config") pod "f62ae2f9-29ff-4f97-951f-7ef86c863a7d" (UID: "f62ae2f9-29ff-4f97-951f-7ef86c863a7d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.888618 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.888703 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.888772 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.888832 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c940d6f6-235b-4817-b022-b5d783c98a5b-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.888884 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:42 crc kubenswrapper[4585]: W0215 17:22:42.906793 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc74b1e0f_71b3_4fe0_9153_4220719171aa.slice/crio-337a8c43c4571aa5e0a1f368dc7c7b23776e8f6d81c46bc268d7ee647bd021bb WatchSource:0}: Error finding container 337a8c43c4571aa5e0a1f368dc7c7b23776e8f6d81c46bc268d7ee647bd021bb: Status 404 returned error can't find the container with id 337a8c43c4571aa5e0a1f368dc7c7b23776e8f6d81c46bc268d7ee647bd021bb Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.952035 4585 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f62ae2f9-29ff-4f97-951f-7ef86c863a7d" (UID: "f62ae2f9-29ff-4f97-951f-7ef86c863a7d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:22:42 crc kubenswrapper[4585]: I0215 17:22:42.991262 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f62ae2f9-29ff-4f97-951f-7ef86c863a7d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.157633 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lxs78"] Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.166815 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lxs78"] Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.285475 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-756b6f8c5f-5zd85"] Feb 15 17:22:43 crc kubenswrapper[4585]: E0215 17:22:43.288923 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerName="init" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.288948 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerName="init" Feb 15 17:22:43 crc kubenswrapper[4585]: E0215 17:22:43.288963 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4375904d-94fb-4c2f-804c-5451f7a71c6d" containerName="keystone-bootstrap" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.288969 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4375904d-94fb-4c2f-804c-5451f7a71c6d" containerName="keystone-bootstrap" Feb 15 17:22:43 crc kubenswrapper[4585]: E0215 17:22:43.288983 4585 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerName="dnsmasq-dns" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.288989 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerName="dnsmasq-dns" Feb 15 17:22:43 crc kubenswrapper[4585]: E0215 17:22:43.289006 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c940d6f6-235b-4817-b022-b5d783c98a5b" containerName="placement-db-sync" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.289013 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c940d6f6-235b-4817-b022-b5d783c98a5b" containerName="placement-db-sync" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.289215 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c940d6f6-235b-4817-b022-b5d783c98a5b" containerName="placement-db-sync" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.289234 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerName="dnsmasq-dns" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.289241 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="4375904d-94fb-4c2f-804c-5451f7a71c6d" containerName="keystone-bootstrap" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.289862 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.298880 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-config-data\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.298939 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-fernet-keys\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.298972 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-credential-keys\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.299873 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-756b6f8c5f-5zd85"] Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.300389 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-internal-tls-certs\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.300415 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-combined-ca-bundle\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.300435 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-scripts\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.300529 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7l6p\" (UniqueName: \"kubernetes.io/projected/15f27cce-5856-41d0-8528-95eba7431a98-kube-api-access-v7l6p\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.300576 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-public-tls-certs\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.301735 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6w8mf" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.301952 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.302107 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 
17:22:43.302214 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.302351 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.302445 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.404866 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7l6p\" (UniqueName: \"kubernetes.io/projected/15f27cce-5856-41d0-8528-95eba7431a98-kube-api-access-v7l6p\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.404909 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-public-tls-certs\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.404933 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-config-data\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.404969 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-fernet-keys\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 
17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.405002 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-credential-keys\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.405048 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-internal-tls-certs\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.405064 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-combined-ca-bundle\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.405084 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-scripts\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.410425 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fd7545564-bhhq2"] Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.422296 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-scripts\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " 
pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.446508 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7l6p\" (UniqueName: \"kubernetes.io/projected/15f27cce-5856-41d0-8528-95eba7431a98-kube-api-access-v7l6p\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.463727 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-public-tls-certs\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.465477 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-internal-tls-certs\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.465813 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-credential-keys\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.465972 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-config-data\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.472710 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-fernet-keys\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.478146 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fd7545564-bhhq2"] Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.478551 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.487174 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.487459 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.487682 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.487856 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fjt9j" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.489444 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.501296 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f27cce-5856-41d0-8528-95eba7431a98-combined-ca-bundle\") pod \"keystone-756b6f8c5f-5zd85\" (UID: \"15f27cce-5856-41d0-8528-95eba7431a98\") " pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.612614 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-public-tls-certs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.612661 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-scripts\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.612679 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-config-data\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.612703 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bb78e4-1277-494a-a47e-3a2c4ce18228-logs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.612769 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-internal-tls-certs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.612792 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-combined-ca-bundle\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.612811 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnplt\" (UniqueName: \"kubernetes.io/projected/77bb78e4-1277-494a-a47e-3a2c4ce18228-kube-api-access-vnplt\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.657098 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.749487 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5dfc4c95db-jlklr"] Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.751202 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754269 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-scripts\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754316 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bb78e4-1277-494a-a47e-3a2c4ce18228-logs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754338 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-public-tls-certs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754384 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vcp8\" (UniqueName: \"kubernetes.io/projected/60d01fff-4dd5-4cc0-9cce-06d41728c238-kube-api-access-7vcp8\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754419 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-internal-tls-certs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " 
pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754442 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-combined-ca-bundle\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754461 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnplt\" (UniqueName: \"kubernetes.io/projected/77bb78e4-1277-494a-a47e-3a2c4ce18228-kube-api-access-vnplt\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754483 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d01fff-4dd5-4cc0-9cce-06d41728c238-logs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754503 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-internal-tls-certs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754519 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-combined-ca-bundle\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " 
pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.754570 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-config-data\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.759345 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bb78e4-1277-494a-a47e-3a2c4ce18228-logs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.764633 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-public-tls-certs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.764713 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-scripts\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.764734 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-config-data\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.767687 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-combined-ca-bundle\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.773558 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-scripts\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.774341 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-config-data\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.776941 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5dfc4c95db-jlklr"] Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.784260 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-internal-tls-certs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.800919 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-public-tls-certs\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: 
I0215 17:22:43.811629 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5zw55" event={"ID":"3701151b-dc31-421f-a1e1-9d694e13bc86","Type":"ContainerStarted","Data":"5ff4d5723ca71a31e08385d92efbd2c524e765f3b6516afd8a057d919d6d70f2"} Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.846581 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5zw55" podStartSLOduration=4.511144405 podStartE2EDuration="53.846562251s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="2026-02-15 17:21:52.841309376 +0000 UTC m=+968.784717508" lastFinishedPulling="2026-02-15 17:22:42.176727222 +0000 UTC m=+1018.120135354" observedRunningTime="2026-02-15 17:22:43.845539032 +0000 UTC m=+1019.788947164" watchObservedRunningTime="2026-02-15 17:22:43.846562251 +0000 UTC m=+1019.789970383" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.852036 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74b1e0f-71b3-4fe0-9153-4220719171aa","Type":"ContainerStarted","Data":"337a8c43c4571aa5e0a1f368dc7c7b23776e8f6d81c46bc268d7ee647bd021bb"} Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.854884 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnplt\" (UniqueName: \"kubernetes.io/projected/77bb78e4-1277-494a-a47e-3a2c4ce18228-kube-api-access-vnplt\") pod \"placement-fd7545564-bhhq2\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.882073 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vcp8\" (UniqueName: \"kubernetes.io/projected/60d01fff-4dd5-4cc0-9cce-06d41728c238-kube-api-access-7vcp8\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 
17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.882287 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d01fff-4dd5-4cc0-9cce-06d41728c238-logs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.882378 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-internal-tls-certs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.882457 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-combined-ca-bundle\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.882570 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-config-data\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.882684 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-scripts\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.882777 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-public-tls-certs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.887022 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60d01fff-4dd5-4cc0-9cce-06d41728c238-logs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.904923 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.907942 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-config-data\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.908255 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-scripts\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.909667 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-internal-tls-certs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 
17:22:43.912690 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vcp8\" (UniqueName: \"kubernetes.io/projected/60d01fff-4dd5-4cc0-9cce-06d41728c238-kube-api-access-7vcp8\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.912779 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-combined-ca-bundle\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.913135 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d01fff-4dd5-4cc0-9cce-06d41728c238-public-tls-certs\") pod \"placement-5dfc4c95db-jlklr\" (UID: \"60d01fff-4dd5-4cc0-9cce-06d41728c238\") " pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:43 crc kubenswrapper[4585]: I0215 17:22:43.946117 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.443888 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.445230 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.445242 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.445250 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.505551 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-756b6f8c5f-5zd85"] Feb 15 17:22:44 crc kubenswrapper[4585]: W0215 17:22:44.550320 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15f27cce_5856_41d0_8528_95eba7431a98.slice/crio-f0f0c30092a8bc8cf6237e5bf1217d092303f8805ffa9950358b7106879016cf WatchSource:0}: Error finding container f0f0c30092a8bc8cf6237e5bf1217d092303f8805ffa9950358b7106879016cf: Status 404 returned error can't find the container with id f0f0c30092a8bc8cf6237e5bf1217d092303f8805ffa9950358b7106879016cf Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.616485 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.617157 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.827764 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-fd7545564-bhhq2"] Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.839498 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5dfc4c95db-jlklr"] Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.892481 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" path="/var/lib/kubelet/pods/f62ae2f9-29ff-4f97-951f-7ef86c863a7d/volumes" Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.895186 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74b1e0f-71b3-4fe0-9153-4220719171aa","Type":"ContainerStarted","Data":"04b9ac1a42f254f32273d0d1e7d8129d13abdc39ccffb7beeee7e40cc40d5351"} Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.921814 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4vdfm" event={"ID":"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff","Type":"ContainerStarted","Data":"a65d5a4a553c51156a7136c874dde7a15f017f6e6b2e80ef2e0797d2a81b178c"} Feb 15 17:22:44 crc kubenswrapper[4585]: I0215 17:22:44.937584 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-756b6f8c5f-5zd85" event={"ID":"15f27cce-5856-41d0-8528-95eba7431a98","Type":"ContainerStarted","Data":"f0f0c30092a8bc8cf6237e5bf1217d092303f8805ffa9950358b7106879016cf"} Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.102949 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4vdfm" podStartSLOduration=4.528979642 podStartE2EDuration="55.10292908s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="2026-02-15 17:21:51.921225423 +0000 UTC m=+967.864633555" lastFinishedPulling="2026-02-15 17:22:42.495174861 +0000 UTC m=+1018.438582993" observedRunningTime="2026-02-15 17:22:45.01008221 +0000 UTC m=+1020.953490342" watchObservedRunningTime="2026-02-15 17:22:45.10292908 +0000 UTC 
m=+1021.046337212" Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.957830 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fd7545564-bhhq2" event={"ID":"77bb78e4-1277-494a-a47e-3a2c4ce18228","Type":"ContainerStarted","Data":"293806b16be4a9fa335a0560f9a9b284c69a1d4a12284279b8cc229ee00ab421"} Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.958263 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fd7545564-bhhq2" event={"ID":"77bb78e4-1277-494a-a47e-3a2c4ce18228","Type":"ContainerStarted","Data":"8752aad40d4846c3b5a6510391f9be277b35c2b5abc35b1a9ba4b101b2d9f42e"} Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.971053 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74b1e0f-71b3-4fe0-9153-4220719171aa","Type":"ContainerStarted","Data":"5b47d60a7a8ba02e8fc7332ebafe7d90aeefdc80c41921e827c3931aa8648964"} Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.974447 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5dfc4c95db-jlklr" event={"ID":"60d01fff-4dd5-4cc0-9cce-06d41728c238","Type":"ContainerStarted","Data":"239829212e60722dcf7cb2816523a9b0e073290f5e33ed51732a62b2d7bea8f0"} Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.974477 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5dfc4c95db-jlklr" event={"ID":"60d01fff-4dd5-4cc0-9cce-06d41728c238","Type":"ContainerStarted","Data":"46380b17469dda4ffb16e68c3a8e72215c8843643444114ed0477f779595a922"} Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.977970 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-756b6f8c5f-5zd85" event={"ID":"15f27cce-5856-41d0-8528-95eba7431a98","Type":"ContainerStarted","Data":"ba1edd4f9603bf2b7b227fd9a1c545f63f347a2076778c60d0d49075eaa82818"} Feb 15 17:22:45 crc kubenswrapper[4585]: I0215 17:22:45.978840 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:22:46 crc kubenswrapper[4585]: I0215 17:22:46.100015 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.099997244 podStartE2EDuration="10.099997244s" podCreationTimestamp="2026-02-15 17:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:46.029079171 +0000 UTC m=+1021.972487303" watchObservedRunningTime="2026-02-15 17:22:46.099997244 +0000 UTC m=+1022.043405376" Feb 15 17:22:46 crc kubenswrapper[4585]: I0215 17:22:46.452410 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-lxs78" podUID="f62ae2f9-29ff-4f97-951f-7ef86c863a7d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.178:5353: i/o timeout" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.007833 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fd7545564-bhhq2" event={"ID":"77bb78e4-1277-494a-a47e-3a2c4ce18228","Type":"ContainerStarted","Data":"4b4eec435e566ac9235ee7e885ef157182b30dadaf697753940f4d3947eab502"} Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.008773 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.009091 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.031690 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5dfc4c95db-jlklr" event={"ID":"60d01fff-4dd5-4cc0-9cce-06d41728c238","Type":"ContainerStarted","Data":"ff72a59789b1ac5daff71e0d39d200d5b677486e80e8db7dfb2d8c14714af1c2"} Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.031729 
4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.032807 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.042551 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-756b6f8c5f-5zd85" podStartSLOduration=4.042537602 podStartE2EDuration="4.042537602s" podCreationTimestamp="2026-02-15 17:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:46.085243252 +0000 UTC m=+1022.028651384" watchObservedRunningTime="2026-02-15 17:22:47.042537602 +0000 UTC m=+1022.985945734" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.046298 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fd7545564-bhhq2" podStartSLOduration=4.046288254 podStartE2EDuration="4.046288254s" podCreationTimestamp="2026-02-15 17:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:47.031640935 +0000 UTC m=+1022.975049077" watchObservedRunningTime="2026-02-15 17:22:47.046288254 +0000 UTC m=+1022.989696386" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.054836 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5dfc4c95db-jlklr" podStartSLOduration=4.054826046 podStartE2EDuration="4.054826046s" podCreationTimestamp="2026-02-15 17:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:22:47.051585808 +0000 UTC m=+1022.994993940" watchObservedRunningTime="2026-02-15 17:22:47.054826046 +0000 UTC m=+1022.998234178" Feb 15 
17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.374564 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.374677 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.473538 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:47 crc kubenswrapper[4585]: I0215 17:22:47.477751 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:48 crc kubenswrapper[4585]: I0215 17:22:48.046857 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:48 crc kubenswrapper[4585]: I0215 17:22:48.046930 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:49 crc kubenswrapper[4585]: I0215 17:22:49.055029 4585 generic.go:334] "Generic (PLEG): container finished" podID="3701151b-dc31-421f-a1e1-9d694e13bc86" containerID="5ff4d5723ca71a31e08385d92efbd2c524e765f3b6516afd8a057d919d6d70f2" exitCode=0 Feb 15 17:22:49 crc kubenswrapper[4585]: I0215 17:22:49.055118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5zw55" event={"ID":"3701151b-dc31-421f-a1e1-9d694e13bc86","Type":"ContainerDied","Data":"5ff4d5723ca71a31e08385d92efbd2c524e765f3b6516afd8a057d919d6d70f2"} Feb 15 17:22:49 crc kubenswrapper[4585]: I0215 17:22:49.857165 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 15 17:22:49 crc kubenswrapper[4585]: I0215 17:22:49.869395 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Feb 15 17:22:49 crc kubenswrapper[4585]: I0215 17:22:49.933367 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection refused" Feb 15 17:22:50 crc kubenswrapper[4585]: I0215 17:22:50.177433 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused" Feb 15 17:22:51 crc kubenswrapper[4585]: I0215 17:22:51.346401 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:51 crc kubenswrapper[4585]: I0215 17:22:51.359302 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 15 17:22:53 crc kubenswrapper[4585]: I0215 17:22:53.118180 4585 generic.go:334] "Generic (PLEG): container finished" podID="ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" containerID="a65d5a4a553c51156a7136c874dde7a15f017f6e6b2e80ef2e0797d2a81b178c" exitCode=0 Feb 15 17:22:53 crc kubenswrapper[4585]: I0215 17:22:53.118245 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4vdfm" event={"ID":"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff","Type":"ContainerDied","Data":"a65d5a4a553c51156a7136c874dde7a15f017f6e6b2e80ef2e0797d2a81b178c"} Feb 15 17:22:55 crc kubenswrapper[4585]: I0215 17:22:55.967982 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5zw55" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.016521 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125270 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-config-data\") pod \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125341 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k45br\" (UniqueName: \"kubernetes.io/projected/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-kube-api-access-k45br\") pod \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125384 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-combined-ca-bundle\") pod \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125416 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-combined-ca-bundle\") pod \"3701151b-dc31-421f-a1e1-9d694e13bc86\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125465 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-db-sync-config-data\") pod \"3701151b-dc31-421f-a1e1-9d694e13bc86\" (UID: 
\"3701151b-dc31-421f-a1e1-9d694e13bc86\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125493 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-db-sync-config-data\") pod \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125539 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-etc-machine-id\") pod \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125635 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf6r6\" (UniqueName: \"kubernetes.io/projected/3701151b-dc31-421f-a1e1-9d694e13bc86-kube-api-access-kf6r6\") pod \"3701151b-dc31-421f-a1e1-9d694e13bc86\" (UID: \"3701151b-dc31-421f-a1e1-9d694e13bc86\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.125710 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-scripts\") pod \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\" (UID: \"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff\") " Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.128331 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" (UID: "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.131443 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-scripts" (OuterVolumeSpecName: "scripts") pod "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" (UID: "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.135034 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-kube-api-access-k45br" (OuterVolumeSpecName: "kube-api-access-k45br") pod "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" (UID: "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff"). InnerVolumeSpecName "kube-api-access-k45br". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.137952 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3701151b-dc31-421f-a1e1-9d694e13bc86" (UID: "3701151b-dc31-421f-a1e1-9d694e13bc86"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.139507 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" (UID: "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.141525 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3701151b-dc31-421f-a1e1-9d694e13bc86-kube-api-access-kf6r6" (OuterVolumeSpecName: "kube-api-access-kf6r6") pod "3701151b-dc31-421f-a1e1-9d694e13bc86" (UID: "3701151b-dc31-421f-a1e1-9d694e13bc86"). InnerVolumeSpecName "kube-api-access-kf6r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.176026 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" (UID: "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.181062 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4vdfm" event={"ID":"ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff","Type":"ContainerDied","Data":"646dc1e86fa59bc74cc670719e08b87183dcd76de730b78e69ccb85da6ea36b6"} Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.181098 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="646dc1e86fa59bc74cc670719e08b87183dcd76de730b78e69ccb85da6ea36b6" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.181151 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4vdfm" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.186328 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerStarted","Data":"61d5708c237318de4c5b54253fe38a8a6e5ad9f6ad2ff99f79ae888bcd7dfafb"} Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.186382 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-central-agent" containerID="cri-o://a7500de9e2f80a57638642b20ce7bb90902d5ee491b2f0946a6106887712c69d" gracePeriod=30 Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.186425 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.186935 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-notification-agent" containerID="cri-o://936ac29cae1c80c260b0921007b58089ffbc7ab4b0f219a8fb48f11beef758d1" gracePeriod=30 Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.187003 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="sg-core" containerID="cri-o://933becb359a1a0ce857ab253117d920a4490893087b8993cfd3dae34e95f3490" gracePeriod=30 Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.186469 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="proxy-httpd" containerID="cri-o://61d5708c237318de4c5b54253fe38a8a6e5ad9f6ad2ff99f79ae888bcd7dfafb" gracePeriod=30 Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.200281 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5zw55" event={"ID":"3701151b-dc31-421f-a1e1-9d694e13bc86","Type":"ContainerDied","Data":"ae7f85c8ca84766c0484becc447138fe744fb88beef840b61bcea59d60e7c4b4"} Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.200320 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7f85c8ca84766c0484becc447138fe744fb88beef840b61bcea59d60e7c4b4" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.200373 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5zw55" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.207475 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3701151b-dc31-421f-a1e1-9d694e13bc86" (UID: "3701151b-dc31-421f-a1e1-9d694e13bc86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.212766 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.825515443 podStartE2EDuration="1m6.212751401s" podCreationTimestamp="2026-02-15 17:21:50 +0000 UTC" firstStartedPulling="2026-02-15 17:21:52.447693996 +0000 UTC m=+968.391102128" lastFinishedPulling="2026-02-15 17:22:55.834929954 +0000 UTC m=+1031.778338086" observedRunningTime="2026-02-15 17:22:56.207719313 +0000 UTC m=+1032.151127445" watchObservedRunningTime="2026-02-15 17:22:56.212751401 +0000 UTC m=+1032.156159533" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.224699 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-config-data" (OuterVolumeSpecName: "config-data") pod "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" (UID: "ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227714 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227737 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227747 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k45br\" (UniqueName: \"kubernetes.io/projected/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-kube-api-access-k45br\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227758 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227769 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227777 4585 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3701151b-dc31-421f-a1e1-9d694e13bc86-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227785 4585 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 
17:22:56.227795 4585 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:56 crc kubenswrapper[4585]: I0215 17:22:56.227803 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf6r6\" (UniqueName: \"kubernetes.io/projected/3701151b-dc31-421f-a1e1-9d694e13bc86-kube-api-access-kf6r6\") on node \"crc\" DevicePath \"\"" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.209924 4585 generic.go:334] "Generic (PLEG): container finished" podID="624871de-b62e-4eae-a220-a5d34995919d" containerID="61d5708c237318de4c5b54253fe38a8a6e5ad9f6ad2ff99f79ae888bcd7dfafb" exitCode=0 Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.210173 4585 generic.go:334] "Generic (PLEG): container finished" podID="624871de-b62e-4eae-a220-a5d34995919d" containerID="933becb359a1a0ce857ab253117d920a4490893087b8993cfd3dae34e95f3490" exitCode=2 Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.210181 4585 generic.go:334] "Generic (PLEG): container finished" podID="624871de-b62e-4eae-a220-a5d34995919d" containerID="a7500de9e2f80a57638642b20ce7bb90902d5ee491b2f0946a6106887712c69d" exitCode=0 Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.210198 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerDied","Data":"61d5708c237318de4c5b54253fe38a8a6e5ad9f6ad2ff99f79ae888bcd7dfafb"} Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.210220 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerDied","Data":"933becb359a1a0ce857ab253117d920a4490893087b8993cfd3dae34e95f3490"} Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.210229 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerDied","Data":"a7500de9e2f80a57638642b20ce7bb90902d5ee491b2f0946a6106887712c69d"} Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.413381 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-666c6c699f-9zzq7"] Feb 15 17:22:57 crc kubenswrapper[4585]: E0215 17:22:57.413883 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3701151b-dc31-421f-a1e1-9d694e13bc86" containerName="barbican-db-sync" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.413900 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="3701151b-dc31-421f-a1e1-9d694e13bc86" containerName="barbican-db-sync" Feb 15 17:22:57 crc kubenswrapper[4585]: E0215 17:22:57.413917 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" containerName="cinder-db-sync" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.413923 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" containerName="cinder-db-sync" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.414153 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="3701151b-dc31-421f-a1e1-9d694e13bc86" containerName="barbican-db-sync" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.414177 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" containerName="cinder-db-sync" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.415229 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.426365 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.426569 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.426701 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-h7p4c" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.429818 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.431643 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.434915 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.435231 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.435334 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tlwjs" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.435430 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.472555 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-765946969b-cdgqp"] Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.498421 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.516114 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.573854 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-config-data-custom\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.573945 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-scripts\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.573973 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f9940b-28db-47ee-9a5b-771a1f757af5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574004 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-combined-ca-bundle\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574027 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574051 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rt48\" (UniqueName: \"kubernetes.io/projected/32f9940b-28db-47ee-9a5b-771a1f757af5-kube-api-access-9rt48\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574080 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574114 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e924d6a-e504-41b5-8268-f6df32a3e507-logs\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574131 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574183 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9fpw\" (UniqueName: \"kubernetes.io/projected/8e924d6a-e504-41b5-8268-f6df32a3e507-kube-api-access-g9fpw\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.574233 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-config-data\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679508 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-config-data\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679556 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svdpq\" (UniqueName: \"kubernetes.io/projected/42089924-7f2c-40bd-a930-74f9ae10b784-kube-api-access-svdpq\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679612 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-config-data\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " 
pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679657 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-config-data-custom\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679720 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42089924-7f2c-40bd-a930-74f9ae10b784-logs\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679738 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-scripts\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679763 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f9940b-28db-47ee-9a5b-771a1f757af5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679790 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-combined-ca-bundle\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " 
pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679819 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679852 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rt48\" (UniqueName: \"kubernetes.io/projected/32f9940b-28db-47ee-9a5b-771a1f757af5-kube-api-access-9rt48\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679876 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-combined-ca-bundle\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679899 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679930 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e924d6a-e504-41b5-8268-f6df32a3e507-logs\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 
17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679949 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.679993 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-config-data-custom\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.680026 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9fpw\" (UniqueName: \"kubernetes.io/projected/8e924d6a-e504-41b5-8268-f6df32a3e507-kube-api-access-g9fpw\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.690741 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f9940b-28db-47ee-9a5b-771a1f757af5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.726960 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e924d6a-e504-41b5-8268-f6df32a3e507-logs\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 
17:22:57.727572 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-scripts\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.728047 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-combined-ca-bundle\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.728173 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-666c6c699f-9zzq7"] Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.731686 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.733415 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-config-data-custom\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.755823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e924d6a-e504-41b5-8268-f6df32a3e507-config-data\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 
17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.758477 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.758540 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.766783 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.770380 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.789704 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-config-data-custom\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.789784 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svdpq\" (UniqueName: \"kubernetes.io/projected/42089924-7f2c-40bd-a930-74f9ae10b784-kube-api-access-svdpq\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.789817 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-config-data\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.789886 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42089924-7f2c-40bd-a930-74f9ae10b784-logs\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.789928 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-combined-ca-bundle\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.801175 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42089924-7f2c-40bd-a930-74f9ae10b784-logs\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.801695 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9fpw\" (UniqueName: \"kubernetes.io/projected/8e924d6a-e504-41b5-8268-f6df32a3e507-kube-api-access-g9fpw\") pod \"barbican-worker-666c6c699f-9zzq7\" (UID: \"8e924d6a-e504-41b5-8268-f6df32a3e507\") " pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: 
I0215 17:22:57.805347 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-config-data-custom\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.805386 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-765946969b-cdgqp"] Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.824008 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rt48\" (UniqueName: \"kubernetes.io/projected/32f9940b-28db-47ee-9a5b-771a1f757af5-kube-api-access-9rt48\") pod \"cinder-scheduler-0\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.830860 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.842629 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-config-data\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.848502 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42089924-7f2c-40bd-a930-74f9ae10b784-combined-ca-bundle\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.866974 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-666c6c699f-9zzq7" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.874098 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-774db89647-sdw5c"] Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.875899 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.884822 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svdpq\" (UniqueName: \"kubernetes.io/projected/42089924-7f2c-40bd-a930-74f9ae10b784-kube-api-access-svdpq\") pod \"barbican-keystone-listener-765946969b-cdgqp\" (UID: \"42089924-7f2c-40bd-a930-74f9ae10b784\") " pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.885217 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-765946969b-cdgqp" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.950668 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774db89647-sdw5c"] Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.996928 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-svc\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.996967 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.997009 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-config\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.997054 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrshh\" (UniqueName: \"kubernetes.io/projected/25ce1253-6c61-4260-ae89-b2ec88a78b73-kube-api-access-rrshh\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.997109 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:57 crc kubenswrapper[4585]: I0215 17:22:57.997141 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.022660 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-sdw5c"] Feb 15 17:22:58 crc kubenswrapper[4585]: E0215 17:22:58.023552 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-rrshh ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-774db89647-sdw5c" podUID="25ce1253-6c61-4260-ae89-b2ec88a78b73" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.037715 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-66d8b59fb-fq9sk"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.039516 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.042813 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.075016 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7gtqn"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.076716 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.098889 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.098934 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-combined-ca-bundle\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.098953 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffv7v\" (UniqueName: \"kubernetes.io/projected/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-kube-api-access-ffv7v\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.098975 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-config\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.098994 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099015 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099033 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099053 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099071 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rrshh\" (UniqueName: \"kubernetes.io/projected/25ce1253-6c61-4260-ae89-b2ec88a78b73-kube-api-access-rrshh\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099149 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099180 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-logs\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099221 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-config\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099247 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wblvs\" (UniqueName: \"kubernetes.io/projected/23c49d13-f495-4583-8132-00a2af47b3ef-kube-api-access-wblvs\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099275 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data-custom\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099296 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-svc\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.099317 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.100157 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.101164 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.101723 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.102181 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-svc\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.102704 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-config\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.112867 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66d8b59fb-fq9sk"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.133755 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrshh\" (UniqueName: \"kubernetes.io/projected/25ce1253-6c61-4260-ae89-b2ec88a78b73-kube-api-access-rrshh\") pod \"dnsmasq-dns-774db89647-sdw5c\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") " pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.146074 4585 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7gtqn"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.202562 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.202841 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-combined-ca-bundle\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.202875 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffv7v\" (UniqueName: \"kubernetes.io/projected/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-kube-api-access-ffv7v\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.202898 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.202916 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " 
pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.202936 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.202957 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.203039 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-logs\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.203080 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-config\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.203114 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wblvs\" (UniqueName: \"kubernetes.io/projected/23c49d13-f495-4583-8132-00a2af47b3ef-kube-api-access-wblvs\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: 
I0215 17:22:58.203139 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data-custom\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.204610 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.205005 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-logs\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.205633 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.205640 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-config\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.206147 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.211786 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.212290 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.220818 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.240713 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.252334 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.258036 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-sdw5c" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.275192 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.278183 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffv7v\" (UniqueName: \"kubernetes.io/projected/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-kube-api-access-ffv7v\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.278467 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-combined-ca-bundle\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.279202 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data-custom\") pod \"barbican-api-66d8b59fb-fq9sk\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.279850 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wblvs\" (UniqueName: \"kubernetes.io/projected/23c49d13-f495-4583-8132-00a2af47b3ef-kube-api-access-wblvs\") pod \"dnsmasq-dns-6578955fd5-7gtqn\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.307735 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-scripts\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.308017 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.308108 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.308211 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/777e7a72-1307-4f36-ae59-77ca3d534006-etc-machine-id\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.308304 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j87ft\" (UniqueName: \"kubernetes.io/projected/777e7a72-1307-4f36-ae59-77ca3d534006-kube-api-access-j87ft\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.308378 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.308447 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777e7a72-1307-4f36-ae59-77ca3d534006-logs\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.333656 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b7cd54d97-kklm5"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.333924 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b7cd54d97-kklm5" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-api" containerID="cri-o://bf9a3c887db9c49ddb20f772e5e627ff23b8d3443a8dd3275cf02ae67bd458ec" gracePeriod=30 Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.334789 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b7cd54d97-kklm5" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-httpd" containerID="cri-o://b093695f5f74e82abd78c320eedc1a1232383d4410243623b6ec957ca33df879" gracePeriod=30 Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.378132 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.379472 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-65d8658977-kc9xn"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.381272 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.405507 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65d8658977-kc9xn"] Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.410029 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-scripts\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.414990 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-internal-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415167 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-public-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415263 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242fb\" (UniqueName: \"kubernetes.io/projected/732fcac3-39e1-4937-9a97-f243a37bc41b-kube-api-access-242fb\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415373 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415471 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-config\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415562 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415668 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-ovndb-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415815 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/777e7a72-1307-4f36-ae59-77ca3d534006-etc-machine-id\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.415956 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j87ft\" (UniqueName: \"kubernetes.io/projected/777e7a72-1307-4f36-ae59-77ca3d534006-kube-api-access-j87ft\") pod \"cinder-api-0\" (UID: 
\"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.416071 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data-custom\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.416164 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777e7a72-1307-4f36-ae59-77ca3d534006-logs\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.416315 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-httpd-config\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.416452 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-combined-ca-bundle\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.417804 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/777e7a72-1307-4f36-ae59-77ca3d534006-etc-machine-id\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0" Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.421027 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777e7a72-1307-4f36-ae59-77ca3d534006-logs\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.425074 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.438164 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-scripts\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.468199 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.475648 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data-custom\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.476438 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.481140 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j87ft\" (UniqueName: \"kubernetes.io/projected/777e7a72-1307-4f36-ae59-77ca3d534006-kube-api-access-j87ft\") pod \"cinder-api-0\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " pod="openstack/cinder-api-0"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.520088 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-httpd-config\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.520143 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-combined-ca-bundle\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.520187 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-internal-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.520223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-public-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.520244 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-242fb\" (UniqueName: \"kubernetes.io/projected/732fcac3-39e1-4937-9a97-f243a37bc41b-kube-api-access-242fb\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.520273 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-config\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.520300 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-ovndb-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.565700 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-ovndb-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.578786 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-sdw5c"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.587571 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-public-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.603644 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-httpd-config\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.603934 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-config\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.604146 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-internal-tls-certs\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.608182 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-242fb\" (UniqueName: \"kubernetes.io/projected/732fcac3-39e1-4937-9a97-f243a37bc41b-kube-api-access-242fb\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.610368 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732fcac3-39e1-4937-9a97-f243a37bc41b-combined-ca-bundle\") pod \"neutron-65d8658977-kc9xn\" (UID: \"732fcac3-39e1-4937-9a97-f243a37bc41b\") " pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.623451 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.629779 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-swift-storage-0\") pod \"25ce1253-6c61-4260-ae89-b2ec88a78b73\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") "
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.629930 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrshh\" (UniqueName: \"kubernetes.io/projected/25ce1253-6c61-4260-ae89-b2ec88a78b73-kube-api-access-rrshh\") pod \"25ce1253-6c61-4260-ae89-b2ec88a78b73\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") "
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.630003 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-svc\") pod \"25ce1253-6c61-4260-ae89-b2ec88a78b73\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") "
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.630093 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-nb\") pod \"25ce1253-6c61-4260-ae89-b2ec88a78b73\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") "
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.630146 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-sb\") pod \"25ce1253-6c61-4260-ae89-b2ec88a78b73\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") "
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.630202 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-config\") pod \"25ce1253-6c61-4260-ae89-b2ec88a78b73\" (UID: \"25ce1253-6c61-4260-ae89-b2ec88a78b73\") "
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.630714 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25ce1253-6c61-4260-ae89-b2ec88a78b73" (UID: "25ce1253-6c61-4260-ae89-b2ec88a78b73"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.630974 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-config" (OuterVolumeSpecName: "config") pod "25ce1253-6c61-4260-ae89-b2ec88a78b73" (UID: "25ce1253-6c61-4260-ae89-b2ec88a78b73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.631032 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25ce1253-6c61-4260-ae89-b2ec88a78b73" (UID: "25ce1253-6c61-4260-ae89-b2ec88a78b73"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.631848 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25ce1253-6c61-4260-ae89-b2ec88a78b73" (UID: "25ce1253-6c61-4260-ae89-b2ec88a78b73"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.638732 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ce1253-6c61-4260-ae89-b2ec88a78b73-kube-api-access-rrshh" (OuterVolumeSpecName: "kube-api-access-rrshh") pod "25ce1253-6c61-4260-ae89-b2ec88a78b73" (UID: "25ce1253-6c61-4260-ae89-b2ec88a78b73"). InnerVolumeSpecName "kube-api-access-rrshh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.649008 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25ce1253-6c61-4260-ae89-b2ec88a78b73" (UID: "25ce1253-6c61-4260-ae89-b2ec88a78b73"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.651887 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.843019 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-config\") on node \"crc\" DevicePath \"\""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.843050 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.843061 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrshh\" (UniqueName: \"kubernetes.io/projected/25ce1253-6c61-4260-ae89-b2ec88a78b73-kube-api-access-rrshh\") on node \"crc\" DevicePath \"\""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.843070 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.843086 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 15 17:22:58 crc kubenswrapper[4585]: I0215 17:22:58.843097 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ce1253-6c61-4260-ae89-b2ec88a78b73-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.124419 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.302868 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-sdw5c"
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.302950 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32f9940b-28db-47ee-9a5b-771a1f757af5","Type":"ContainerStarted","Data":"aea6a8f201305e68089098f79b480b277939a47371627dd0c2345f717990d4c5"}
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.323814 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-765946969b-cdgqp"]
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.431751 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-sdw5c"]
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.447988 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-774db89647-sdw5c"]
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.642155 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-666c6c699f-9zzq7"]
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.694977 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7gtqn"]
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.763670 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7b7cd54d97-kklm5" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.190:9696/\": read tcp 10.217.0.2:38112->10.217.0.190:9696: read: connection reset by peer"
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.857330 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66d8b59fb-fq9sk"]
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.922789 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7b7cd54d97-kklm5" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.190:9696/\": dial tcp 10.217.0.190:9696: connect: connection refused"
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.931976 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection refused"
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.932030 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b9f5444b-8n6qh"
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.932717 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"e6a81c4c256ad7683acf59828ce484c3e06e042226f140242575aec7b2779784"} pod="openstack/horizon-7b9f5444b-8n6qh" containerMessage="Container horizon failed startup probe, will be restarted"
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.932740 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" containerID="cri-o://e6a81c4c256ad7683acf59828ce484c3e06e042226f140242575aec7b2779784" gracePeriod=30
Feb 15 17:22:59 crc kubenswrapper[4585]: I0215 17:22:59.979105 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.177139 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused"
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.177408 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fb7dd448-vc5x5"
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.178156 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"03cde210d7e2baae60bc76453feecb1542812009c51804c394167be427f59f34"} pod="openstack/horizon-5fb7dd448-vc5x5" containerMessage="Container horizon failed startup probe, will be restarted"
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.178257 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" containerID="cri-o://03cde210d7e2baae60bc76453feecb1542812009c51804c394167be427f59f34" gracePeriod=30
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.178997 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65d8658977-kc9xn"]
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.343569 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65d8658977-kc9xn" event={"ID":"732fcac3-39e1-4937-9a97-f243a37bc41b","Type":"ContainerStarted","Data":"a6fa24854b7d504ac9cc774920ebaa37a5783f1d5bfc228b747b7ff618e9102c"}
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.353108 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-765946969b-cdgqp" event={"ID":"42089924-7f2c-40bd-a930-74f9ae10b784","Type":"ContainerStarted","Data":"ee58700e10313549e5a538cf6d37b0255fa27a46db6b54fadb805824a87bfb89"}
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.363959 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"777e7a72-1307-4f36-ae59-77ca3d534006","Type":"ContainerStarted","Data":"160e87acfac7214079ec9c7c7cd44347fc395311dd64309b2a2e66b6789054e5"}
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.374697 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d8b59fb-fq9sk" event={"ID":"d7374d11-1b9c-4fe9-bc58-82cdd265f67f","Type":"ContainerStarted","Data":"769bdc99f1900744c8398ae49672c7cdbb9f8be05d82d89ca713d3916f3e1ab2"}
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.395843 4585 generic.go:334] "Generic (PLEG): container finished" podID="a588849b-011b-4d05-90d1-e5a41644a556" containerID="b093695f5f74e82abd78c320eedc1a1232383d4410243623b6ec957ca33df879" exitCode=0
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.395951 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cd54d97-kklm5" event={"ID":"a588849b-011b-4d05-90d1-e5a41644a556","Type":"ContainerDied","Data":"b093695f5f74e82abd78c320eedc1a1232383d4410243623b6ec957ca33df879"}
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.428159 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-666c6c699f-9zzq7" event={"ID":"8e924d6a-e504-41b5-8268-f6df32a3e507","Type":"ContainerStarted","Data":"75b86dd6c211deaeaf6ab33492985ebeea5ee1adae4bcc5efe619d89c620a327"}
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.432475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" event={"ID":"23c49d13-f495-4583-8132-00a2af47b3ef","Type":"ContainerStarted","Data":"d2353f4c1499bfc03de18154f70ecf739efca16402336a7f574970354298626a"}
Feb 15 17:23:00 crc kubenswrapper[4585]: I0215 17:23:00.870758 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ce1253-6c61-4260-ae89-b2ec88a78b73" path="/var/lib/kubelet/pods/25ce1253-6c61-4260-ae89-b2ec88a78b73/volumes"
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.191189 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.477035 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d8b59fb-fq9sk" event={"ID":"d7374d11-1b9c-4fe9-bc58-82cdd265f67f","Type":"ContainerStarted","Data":"b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3"}
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.527628 4585 generic.go:334] "Generic (PLEG): container finished" podID="624871de-b62e-4eae-a220-a5d34995919d" containerID="936ac29cae1c80c260b0921007b58089ffbc7ab4b0f219a8fb48f11beef758d1" exitCode=0
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.527744 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerDied","Data":"936ac29cae1c80c260b0921007b58089ffbc7ab4b0f219a8fb48f11beef758d1"}
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.560858 4585 generic.go:334] "Generic (PLEG): container finished" podID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerID="a7f96a78f761b3e91def9fd79490ece76d3ce60b3832708e70d799fbf4333e2b" exitCode=137
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.561128 4585 generic.go:334] "Generic (PLEG): container finished" podID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerID="837e67512bff9956662e845ee1325e2cfbdafd3a1bdf16b2ccd38299a5676737" exitCode=137
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.561110 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c98d54745-hg9jf" event={"ID":"0ea48db7-287e-47f9-9a08-5b2f153fa269","Type":"ContainerDied","Data":"a7f96a78f761b3e91def9fd79490ece76d3ce60b3832708e70d799fbf4333e2b"}
Feb 15 17:23:01 crc kubenswrapper[4585]: I0215 17:23:01.561290 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c98d54745-hg9jf" event={"ID":"0ea48db7-287e-47f9-9a08-5b2f153fa269","Type":"ContainerDied","Data":"837e67512bff9956662e845ee1325e2cfbdafd3a1bdf16b2ccd38299a5676737"}
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.107941 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c98d54745-hg9jf"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.123200 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.154420 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-scripts\") pod \"0ea48db7-287e-47f9-9a08-5b2f153fa269\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.154472 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-config-data\") pod \"624871de-b62e-4eae-a220-a5d34995919d\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.154545 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-run-httpd\") pod \"624871de-b62e-4eae-a220-a5d34995919d\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.154577 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-scripts\") pod \"624871de-b62e-4eae-a220-a5d34995919d\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.154666 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ea48db7-287e-47f9-9a08-5b2f153fa269-logs\") pod \"0ea48db7-287e-47f9-9a08-5b2f153fa269\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.154936 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-sg-core-conf-yaml\") pod \"624871de-b62e-4eae-a220-a5d34995919d\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.154956 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ea48db7-287e-47f9-9a08-5b2f153fa269-horizon-secret-key\") pod \"0ea48db7-287e-47f9-9a08-5b2f153fa269\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.155013 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-combined-ca-bundle\") pod \"624871de-b62e-4eae-a220-a5d34995919d\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.155049 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-897tw\" (UniqueName: \"kubernetes.io/projected/624871de-b62e-4eae-a220-a5d34995919d-kube-api-access-897tw\") pod \"624871de-b62e-4eae-a220-a5d34995919d\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.155078 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-config-data\") pod \"0ea48db7-287e-47f9-9a08-5b2f153fa269\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.155094 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfvdh\" (UniqueName: \"kubernetes.io/projected/0ea48db7-287e-47f9-9a08-5b2f153fa269-kube-api-access-pfvdh\") pod \"0ea48db7-287e-47f9-9a08-5b2f153fa269\" (UID: \"0ea48db7-287e-47f9-9a08-5b2f153fa269\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.155143 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-log-httpd\") pod \"624871de-b62e-4eae-a220-a5d34995919d\" (UID: \"624871de-b62e-4eae-a220-a5d34995919d\") "
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.157440 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "624871de-b62e-4eae-a220-a5d34995919d" (UID: "624871de-b62e-4eae-a220-a5d34995919d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.157825 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "624871de-b62e-4eae-a220-a5d34995919d" (UID: "624871de-b62e-4eae-a220-a5d34995919d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.158034 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea48db7-287e-47f9-9a08-5b2f153fa269-logs" (OuterVolumeSpecName: "logs") pod "0ea48db7-287e-47f9-9a08-5b2f153fa269" (UID: "0ea48db7-287e-47f9-9a08-5b2f153fa269"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.200206 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624871de-b62e-4eae-a220-a5d34995919d-kube-api-access-897tw" (OuterVolumeSpecName: "kube-api-access-897tw") pod "624871de-b62e-4eae-a220-a5d34995919d" (UID: "624871de-b62e-4eae-a220-a5d34995919d"). InnerVolumeSpecName "kube-api-access-897tw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.204970 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-scripts" (OuterVolumeSpecName: "scripts") pod "624871de-b62e-4eae-a220-a5d34995919d" (UID: "624871de-b62e-4eae-a220-a5d34995919d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.231837 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea48db7-287e-47f9-9a08-5b2f153fa269-kube-api-access-pfvdh" (OuterVolumeSpecName: "kube-api-access-pfvdh") pod "0ea48db7-287e-47f9-9a08-5b2f153fa269" (UID: "0ea48db7-287e-47f9-9a08-5b2f153fa269"). InnerVolumeSpecName "kube-api-access-pfvdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.232971 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea48db7-287e-47f9-9a08-5b2f153fa269-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0ea48db7-287e-47f9-9a08-5b2f153fa269" (UID: "0ea48db7-287e-47f9-9a08-5b2f153fa269"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.283851 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.283874 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624871de-b62e-4eae-a220-a5d34995919d-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.283882 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-scripts\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.283891 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ea48db7-287e-47f9-9a08-5b2f153fa269-logs\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.283899 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ea48db7-287e-47f9-9a08-5b2f153fa269-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.283908 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-897tw\" (UniqueName: \"kubernetes.io/projected/624871de-b62e-4eae-a220-a5d34995919d-kube-api-access-897tw\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.283918 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfvdh\" (UniqueName: \"kubernetes.io/projected/0ea48db7-287e-47f9-9a08-5b2f153fa269-kube-api-access-pfvdh\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.536492 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-config-data" (OuterVolumeSpecName: "config-data") pod "0ea48db7-287e-47f9-9a08-5b2f153fa269" (UID: "0ea48db7-287e-47f9-9a08-5b2f153fa269"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.560940 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "624871de-b62e-4eae-a220-a5d34995919d" (UID: "624871de-b62e-4eae-a220-a5d34995919d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.575217 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"777e7a72-1307-4f36-ae59-77ca3d534006","Type":"ContainerStarted","Data":"282499d815d7b33bb63ed79d14dc7d8c44846b2631dcd8d50a16a5d1829b01ba"}
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.589793 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.590537 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624871de-b62e-4eae-a220-a5d34995919d","Type":"ContainerDied","Data":"ce509fcbd23f9447680f803e5210fb11b059a69f3bcc7517d301d203ded68af2"}
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.590570 4585 scope.go:117] "RemoveContainer" containerID="61d5708c237318de4c5b54253fe38a8a6e5ad9f6ad2ff99f79ae888bcd7dfafb"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.591933 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-scripts" (OuterVolumeSpecName: "scripts") pod "0ea48db7-287e-47f9-9a08-5b2f153fa269" (UID: "0ea48db7-287e-47f9-9a08-5b2f153fa269"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.595321 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-scripts\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.595339 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.595350 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ea48db7-287e-47f9-9a08-5b2f153fa269-config-data\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.601727 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c98d54745-hg9jf"
event={"ID":"0ea48db7-287e-47f9-9a08-5b2f153fa269","Type":"ContainerDied","Data":"9ec53a0362f05adb5488bd2e8139df4b4c67b39d8a1e1193eef0bf547ddc6f2d"}
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.601794 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c98d54745-hg9jf"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.617474 4585 generic.go:334] "Generic (PLEG): container finished" podID="23c49d13-f495-4583-8132-00a2af47b3ef" containerID="86eefa8cacc2f6c04f1346bdaeab6d4bff6962a3749ddccd09270a14cbe92abf" exitCode=0
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.617527 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" event={"ID":"23c49d13-f495-4583-8132-00a2af47b3ef","Type":"ContainerDied","Data":"86eefa8cacc2f6c04f1346bdaeab6d4bff6962a3749ddccd09270a14cbe92abf"}
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.628786 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65d8658977-kc9xn" event={"ID":"732fcac3-39e1-4937-9a97-f243a37bc41b","Type":"ContainerStarted","Data":"8ec96bfd95439ab359aeb864a355fb176b4cf4957c1e8cb69b73ef6a585a4a6f"}
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.684554 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "624871de-b62e-4eae-a220-a5d34995919d" (UID: "624871de-b62e-4eae-a220-a5d34995919d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.691980 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c98d54745-hg9jf"]
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.697764 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.705446 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c98d54745-hg9jf"]
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.727673 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-config-data" (OuterVolumeSpecName: "config-data") pod "624871de-b62e-4eae-a220-a5d34995919d" (UID: "624871de-b62e-4eae-a220-a5d34995919d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.798980 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624871de-b62e-4eae-a220-a5d34995919d-config-data\") on node \"crc\" DevicePath \"\""
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.854387 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" path="/var/lib/kubelet/pods/0ea48db7-287e-47f9-9a08-5b2f153fa269/volumes"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.920782 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.930786 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.942322 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 15 17:23:02 crc kubenswrapper[4585]: E0215 17:23:02.942728 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="sg-core"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.942746 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="sg-core"
Feb 15 17:23:02 crc kubenswrapper[4585]: E0215 17:23:02.942757 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.942764 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon"
Feb 15 17:23:02 crc kubenswrapper[4585]: E0215 17:23:02.942786 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-notification-agent"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.942793 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-notification-agent"
Feb 15 17:23:02 crc kubenswrapper[4585]: E0215 17:23:02.942805 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon-log"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.942810 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon-log"
Feb 15 17:23:02 crc kubenswrapper[4585]: E0215 17:23:02.942823 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="proxy-httpd"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.942829 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="proxy-httpd"
Feb 15 17:23:02 crc kubenswrapper[4585]: E0215 17:23:02.942846 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-central-agent"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.942852 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-central-agent"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.943047 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon-log"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.943067 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-notification-agent"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.943077 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea48db7-287e-47f9-9a08-5b2f153fa269" containerName="horizon" Feb
15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.943095 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="sg-core"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.943105 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="proxy-httpd"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.943115 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="624871de-b62e-4eae-a220-a5d34995919d" containerName="ceilometer-central-agent"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.945656 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.948611 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.949705 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 15 17:23:02 crc kubenswrapper[4585]: I0215 17:23:02.973692 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.004978 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.005021 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-log-httpd\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.005057 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-run-httpd\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.005126 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-config-data\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.005178 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.005196 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjdv\" (UniqueName: \"kubernetes.io/projected/228e72e9-7294-4722-a38f-a0d8b3ae07bb-kube-api-access-mdjdv\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.005217 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-scripts\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.116886 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-scripts\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.117030 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.117095 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-log-httpd\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.117171 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-run-httpd\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.117352 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-config-data\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.117502 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.117521 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjdv\" (UniqueName: \"kubernetes.io/projected/228e72e9-7294-4722-a38f-a0d8b3ae07bb-kube-api-access-mdjdv\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.118173 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-log-httpd\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.118375 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-run-httpd\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.129284 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-scripts\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.134774 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.138657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID:
\"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.150557 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjdv\" (UniqueName: \"kubernetes.io/projected/228e72e9-7294-4722-a38f-a0d8b3ae07bb-kube-api-access-mdjdv\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.160098 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-config-data\") pod \"ceilometer-0\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.260812 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.640307 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32f9940b-28db-47ee-9a5b-771a1f757af5","Type":"ContainerStarted","Data":"1303185595dc90a7aa304742be94cbe74a55b8144a18531f1e50e4980afa62c8"}
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.641716 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d8b59fb-fq9sk" event={"ID":"d7374d11-1b9c-4fe9-bc58-82cdd265f67f","Type":"ContainerStarted","Data":"77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4"}
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.642864 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66d8b59fb-fq9sk"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.642889 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-66d8b59fb-fq9sk"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.684791 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-66d8b59fb-fq9sk" podStartSLOduration=6.684777699 podStartE2EDuration="6.684777699s" podCreationTimestamp="2026-02-15 17:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:03.678671642 +0000 UTC m=+1039.622079774" watchObservedRunningTime="2026-02-15 17:23:03.684777699 +0000 UTC m=+1039.628185831"
Feb 15 17:23:03 crc kubenswrapper[4585]: I0215 17:23:03.945404 4585 scope.go:117] "RemoveContainer" containerID="933becb359a1a0ce857ab253117d920a4490893087b8993cfd3dae34e95f3490"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.095111 4585 scope.go:117] "RemoveContainer" containerID="936ac29cae1c80c260b0921007b58089ffbc7ab4b0f219a8fb48f11beef758d1"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.244528 4585 scope.go:117] "RemoveContainer" containerID="a7500de9e2f80a57638642b20ce7bb90902d5ee491b2f0946a6106887712c69d"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.510132 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f6d69d-zt7nc"]
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.512117 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.516670 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.516831 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.548540 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f6d69d-zt7nc"]
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.599669 4585 scope.go:117] "RemoveContainer" containerID="a7f96a78f761b3e91def9fd79490ece76d3ce60b3832708e70d799fbf4333e2b"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.697484 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-public-tls-certs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.698085 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w968w\" (UniqueName: \"kubernetes.io/projected/00bfac9c-5e69-41c0-813a-c163bb169b0d-kube-api-access-w968w\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.698110 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-internal-tls-certs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.698151 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-combined-ca-bundle\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.698190 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00bfac9c-5e69-41c0-813a-c163bb169b0d-logs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.698261 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-config-data\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.698275 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-config-data-custom\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.726307 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.740952 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-65d8658977-kc9xn"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.772944 4585
generic.go:334] "Generic (PLEG): container finished" podID="a588849b-011b-4d05-90d1-e5a41644a556" containerID="bf9a3c887db9c49ddb20f772e5e627ff23b8d3443a8dd3275cf02ae67bd458ec" exitCode=0
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.773074 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" podStartSLOduration=7.773054728 podStartE2EDuration="7.773054728s" podCreationTimestamp="2026-02-15 17:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:04.755123339 +0000 UTC m=+1040.698531471" watchObservedRunningTime="2026-02-15 17:23:04.773054728 +0000 UTC m=+1040.716462850"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.774186 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cd54d97-kklm5" event={"ID":"a588849b-011b-4d05-90d1-e5a41644a556","Type":"ContainerDied","Data":"bf9a3c887db9c49ddb20f772e5e627ff23b8d3443a8dd3275cf02ae67bd458ec"}
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.793971 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-65d8658977-kc9xn" podStartSLOduration=6.793950057 podStartE2EDuration="6.793950057s" podCreationTimestamp="2026-02-15 17:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:04.778375503 +0000 UTC m=+1040.721783635" watchObservedRunningTime="2026-02-15 17:23:04.793950057 +0000 UTC m=+1040.737358189"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.799566 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w968w\" (UniqueName: \"kubernetes.io/projected/00bfac9c-5e69-41c0-813a-c163bb169b0d-kube-api-access-w968w\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.799631 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-internal-tls-certs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.799687 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-combined-ca-bundle\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.799731 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00bfac9c-5e69-41c0-813a-c163bb169b0d-logs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.799801 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-config-data\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.799815 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-config-data-custom\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.799849 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-public-tls-certs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.802729 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00bfac9c-5e69-41c0-813a-c163bb169b0d-logs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.807123 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-config-data-custom\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.845143 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-internal-tls-certs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.846016 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-config-data\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.846129 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-public-tls-certs\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.847874 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bfac9c-5e69-41c0-813a-c163bb169b0d-combined-ca-bundle\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.864112 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w968w\" (UniqueName: \"kubernetes.io/projected/00bfac9c-5e69-41c0-813a-c163bb169b0d-kube-api-access-w968w\") pod \"barbican-api-6f6d69d-zt7nc\" (UID: \"00bfac9c-5e69-41c0-813a-c163bb169b0d\") " pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.927479 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624871de-b62e-4eae-a220-a5d34995919d" path="/var/lib/kubelet/pods/624871de-b62e-4eae-a220-a5d34995919d/volumes"
Feb 15 17:23:04 crc kubenswrapper[4585]: I0215 17:23:04.928336 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.160617 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f6d69d-zt7nc"
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.268327 4585 scope.go:117] "RemoveContainer" containerID="837e67512bff9956662e845ee1325e2cfbdafd3a1bdf16b2ccd38299a5676737"
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.557349 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b7cd54d97-kklm5"
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.648885 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-combined-ca-bundle\") pod \"a588849b-011b-4d05-90d1-e5a41644a556\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") "
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.648965 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-config\") pod \"a588849b-011b-4d05-90d1-e5a41644a556\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") "
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.649028 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-public-tls-certs\") pod \"a588849b-011b-4d05-90d1-e5a41644a556\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") "
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.649090 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-ovndb-tls-certs\") pod \"a588849b-011b-4d05-90d1-e5a41644a556\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") "
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.649117 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-httpd-config\") pod \"a588849b-011b-4d05-90d1-e5a41644a556\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") "
Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.649185 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vncz\" (UniqueName:
\"kubernetes.io/projected/a588849b-011b-4d05-90d1-e5a41644a556-kube-api-access-6vncz\") pod \"a588849b-011b-4d05-90d1-e5a41644a556\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.649241 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-internal-tls-certs\") pod \"a588849b-011b-4d05-90d1-e5a41644a556\" (UID: \"a588849b-011b-4d05-90d1-e5a41644a556\") " Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.686721 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a588849b-011b-4d05-90d1-e5a41644a556" (UID: "a588849b-011b-4d05-90d1-e5a41644a556"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.707366 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a588849b-011b-4d05-90d1-e5a41644a556-kube-api-access-6vncz" (OuterVolumeSpecName: "kube-api-access-6vncz") pod "a588849b-011b-4d05-90d1-e5a41644a556" (UID: "a588849b-011b-4d05-90d1-e5a41644a556"). InnerVolumeSpecName "kube-api-access-6vncz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.751852 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.751882 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vncz\" (UniqueName: \"kubernetes.io/projected/a588849b-011b-4d05-90d1-e5a41644a556-kube-api-access-6vncz\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.794394 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65d8658977-kc9xn" event={"ID":"732fcac3-39e1-4937-9a97-f243a37bc41b","Type":"ContainerStarted","Data":"6812815f949aad8ec98f84c14b7bc587066661484746ee1536d6b2859ea54cf0"} Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.796044 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerStarted","Data":"53a8f15c47501b3e6e83006f90fc64fc3fde7c6b1253ddc99bbd655218d03271"} Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.809717 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32f9940b-28db-47ee-9a5b-771a1f757af5","Type":"ContainerStarted","Data":"2f03fa3215f5f5c40ec37cfe23f561dc1df63cec3a92a2d65f6aee8f21d8a3e4"} Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.819427 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b7cd54d97-kklm5" Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.820074 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cd54d97-kklm5" event={"ID":"a588849b-011b-4d05-90d1-e5a41644a556","Type":"ContainerDied","Data":"5c1fe7716feeaa7ccd4e30cc4571f877913e667942868c307831d83f5e001e0c"} Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.820446 4585 scope.go:117] "RemoveContainer" containerID="b093695f5f74e82abd78c320eedc1a1232383d4410243623b6ec957ca33df879" Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.826402 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-666c6c699f-9zzq7" event={"ID":"8e924d6a-e504-41b5-8268-f6df32a3e507","Type":"ContainerStarted","Data":"2faba1a63ef642ece1d5845fb491fe81350faac6e3dd631373acdbc6f7649c4c"} Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.837064 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.931583439 podStartE2EDuration="8.837048956s" podCreationTimestamp="2026-02-15 17:22:57 +0000 UTC" firstStartedPulling="2026-02-15 17:22:59.130952972 +0000 UTC m=+1035.074361104" lastFinishedPulling="2026-02-15 17:23:00.036418489 +0000 UTC m=+1035.979826621" observedRunningTime="2026-02-15 17:23:05.8265778 +0000 UTC m=+1041.769985932" watchObservedRunningTime="2026-02-15 17:23:05.837048956 +0000 UTC m=+1041.780457088" Feb 15 17:23:05 crc kubenswrapper[4585]: I0215 17:23:05.851550 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" event={"ID":"23c49d13-f495-4583-8132-00a2af47b3ef","Type":"ContainerStarted","Data":"f2b274b38fc6ee93651a962e16084929d27cf97caf323f160d1e8c32684ad320"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.010019 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f6d69d-zt7nc"] Feb 15 17:23:06 crc 
kubenswrapper[4585]: I0215 17:23:06.064575 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a588849b-011b-4d05-90d1-e5a41644a556" (UID: "a588849b-011b-4d05-90d1-e5a41644a556"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.151906 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-config" (OuterVolumeSpecName: "config") pod "a588849b-011b-4d05-90d1-e5a41644a556" (UID: "a588849b-011b-4d05-90d1-e5a41644a556"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.154624 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a588849b-011b-4d05-90d1-e5a41644a556" (UID: "a588849b-011b-4d05-90d1-e5a41644a556"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.160880 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.160915 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.160925 4585 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.205821 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a588849b-011b-4d05-90d1-e5a41644a556" (UID: "a588849b-011b-4d05-90d1-e5a41644a556"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.267585 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.277240 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a588849b-011b-4d05-90d1-e5a41644a556" (UID: "a588849b-011b-4d05-90d1-e5a41644a556"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.360216 4585 scope.go:117] "RemoveContainer" containerID="bf9a3c887db9c49ddb20f772e5e627ff23b8d3443a8dd3275cf02ae67bd458ec" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.369266 4585 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a588849b-011b-4d05-90d1-e5a41644a556-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.460168 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b7cd54d97-kklm5"] Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.468579 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b7cd54d97-kklm5"] Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.853259 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a588849b-011b-4d05-90d1-e5a41644a556" path="/var/lib/kubelet/pods/a588849b-011b-4d05-90d1-e5a41644a556/volumes" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.879862 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerStarted","Data":"c51b4188a7de9475092079c1adc41e734a477d2a9dbcdb76e5aae1c07408bc42"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.902711 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-666c6c699f-9zzq7" event={"ID":"8e924d6a-e504-41b5-8268-f6df32a3e507","Type":"ContainerStarted","Data":"f2f28e1fadb1dee6a057eb19192359c67e77ea6017e8cd906a247aefcb75e75d"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.907572 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d69d-zt7nc" event={"ID":"00bfac9c-5e69-41c0-813a-c163bb169b0d","Type":"ContainerStarted","Data":"6083e6dd474d0151d0910ad88f0d99c6f127a2cb988cd612f6d5bd450df598ce"} 
Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.907614 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d69d-zt7nc" event={"ID":"00bfac9c-5e69-41c0-813a-c163bb169b0d","Type":"ContainerStarted","Data":"1dd6250ecf96b404af82f5a67e367674dee6129e76df93cebf479766a929f2a6"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.907625 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f6d69d-zt7nc" event={"ID":"00bfac9c-5e69-41c0-813a-c163bb169b0d","Type":"ContainerStarted","Data":"bbfa7ac8c281d772575013e7fbb7954ef9faf96053dc617ebbc85b6250b03325"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.908489 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f6d69d-zt7nc" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.908519 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f6d69d-zt7nc" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.912656 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-765946969b-cdgqp" event={"ID":"42089924-7f2c-40bd-a930-74f9ae10b784","Type":"ContainerStarted","Data":"c159af1401001fcc25658b174c23ee23d85f01f46330a068024f6149238b7f1c"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.912716 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-765946969b-cdgqp" event={"ID":"42089924-7f2c-40bd-a930-74f9ae10b784","Type":"ContainerStarted","Data":"db08d0a4c65d73187c4c5b9d4922397bfc5df0f5b54cae08e32ce42628d55901"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.921661 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"777e7a72-1307-4f36-ae59-77ca3d534006","Type":"ContainerStarted","Data":"234812b9bd2ca339bd36f91e68c63f69f2bf3c1eb4b9b5ac00177f08829312b7"} Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.921929 
4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api-log" containerID="cri-o://282499d815d7b33bb63ed79d14dc7d8c44846b2631dcd8d50a16a5d1829b01ba" gracePeriod=30 Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.922038 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api" containerID="cri-o://234812b9bd2ca339bd36f91e68c63f69f2bf3c1eb4b9b5ac00177f08829312b7" gracePeriod=30 Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.935655 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-666c6c699f-9zzq7" podStartSLOduration=5.487026856 podStartE2EDuration="9.935640396s" podCreationTimestamp="2026-02-15 17:22:57 +0000 UTC" firstStartedPulling="2026-02-15 17:22:59.652833014 +0000 UTC m=+1035.596241146" lastFinishedPulling="2026-02-15 17:23:04.101446554 +0000 UTC m=+1040.044854686" observedRunningTime="2026-02-15 17:23:06.924214515 +0000 UTC m=+1042.867622647" watchObservedRunningTime="2026-02-15 17:23:06.935640396 +0000 UTC m=+1042.879048528" Feb 15 17:23:06 crc kubenswrapper[4585]: I0215 17:23:06.977792 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f6d69d-zt7nc" podStartSLOduration=2.977776885 podStartE2EDuration="2.977776885s" podCreationTimestamp="2026-02-15 17:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:06.964889093 +0000 UTC m=+1042.908297225" watchObservedRunningTime="2026-02-15 17:23:06.977776885 +0000 UTC m=+1042.921185017" Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.020189 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=10.02017595 podStartE2EDuration="10.02017595s" podCreationTimestamp="2026-02-15 17:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:07.015921623 +0000 UTC m=+1042.959329755" watchObservedRunningTime="2026-02-15 17:23:07.02017595 +0000 UTC m=+1042.963584082" Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.062322 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-765946969b-cdgqp" podStartSLOduration=5.303761011 podStartE2EDuration="10.062304478s" podCreationTimestamp="2026-02-15 17:22:57 +0000 UTC" firstStartedPulling="2026-02-15 17:22:59.342922698 +0000 UTC m=+1035.286330830" lastFinishedPulling="2026-02-15 17:23:04.101466165 +0000 UTC m=+1040.044874297" observedRunningTime="2026-02-15 17:23:07.047952637 +0000 UTC m=+1042.991360769" watchObservedRunningTime="2026-02-15 17:23:07.062304478 +0000 UTC m=+1043.005712610" Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.835864 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.839743 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.196:8080/\": dial tcp 10.217.0.196:8080: connect: connection refused" Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.949961 4585 generic.go:334] "Generic (PLEG): container finished" podID="777e7a72-1307-4f36-ae59-77ca3d534006" containerID="234812b9bd2ca339bd36f91e68c63f69f2bf3c1eb4b9b5ac00177f08829312b7" exitCode=0 Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.949993 4585 generic.go:334] "Generic (PLEG): container finished" podID="777e7a72-1307-4f36-ae59-77ca3d534006" 
containerID="282499d815d7b33bb63ed79d14dc7d8c44846b2631dcd8d50a16a5d1829b01ba" exitCode=143 Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.950111 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"777e7a72-1307-4f36-ae59-77ca3d534006","Type":"ContainerDied","Data":"234812b9bd2ca339bd36f91e68c63f69f2bf3c1eb4b9b5ac00177f08829312b7"} Feb 15 17:23:07 crc kubenswrapper[4585]: I0215 17:23:07.950138 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"777e7a72-1307-4f36-ae59-77ca3d534006","Type":"ContainerDied","Data":"282499d815d7b33bb63ed79d14dc7d8c44846b2631dcd8d50a16a5d1829b01ba"} Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.006727 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerStarted","Data":"93d552cd7db76d24983637cd4c14652529196123b7e1febb634f201dcb73da0b"} Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.279245 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.417766 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/777e7a72-1307-4f36-ae59-77ca3d534006-etc-machine-id\") pod \"777e7a72-1307-4f36-ae59-77ca3d534006\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.417819 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data\") pod \"777e7a72-1307-4f36-ae59-77ca3d534006\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.417872 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data-custom\") pod \"777e7a72-1307-4f36-ae59-77ca3d534006\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.417891 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/777e7a72-1307-4f36-ae59-77ca3d534006-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "777e7a72-1307-4f36-ae59-77ca3d534006" (UID: "777e7a72-1307-4f36-ae59-77ca3d534006"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.417913 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j87ft\" (UniqueName: \"kubernetes.io/projected/777e7a72-1307-4f36-ae59-77ca3d534006-kube-api-access-j87ft\") pod \"777e7a72-1307-4f36-ae59-77ca3d534006\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.418070 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777e7a72-1307-4f36-ae59-77ca3d534006-logs\") pod \"777e7a72-1307-4f36-ae59-77ca3d534006\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.418189 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-scripts\") pod \"777e7a72-1307-4f36-ae59-77ca3d534006\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.418229 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-combined-ca-bundle\") pod \"777e7a72-1307-4f36-ae59-77ca3d534006\" (UID: \"777e7a72-1307-4f36-ae59-77ca3d534006\") " Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.418368 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/777e7a72-1307-4f36-ae59-77ca3d534006-logs" (OuterVolumeSpecName: "logs") pod "777e7a72-1307-4f36-ae59-77ca3d534006" (UID: "777e7a72-1307-4f36-ae59-77ca3d534006"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.419337 4585 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/777e7a72-1307-4f36-ae59-77ca3d534006-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.419354 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777e7a72-1307-4f36-ae59-77ca3d534006-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.431954 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "777e7a72-1307-4f36-ae59-77ca3d534006" (UID: "777e7a72-1307-4f36-ae59-77ca3d534006"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.432272 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777e7a72-1307-4f36-ae59-77ca3d534006-kube-api-access-j87ft" (OuterVolumeSpecName: "kube-api-access-j87ft") pod "777e7a72-1307-4f36-ae59-77ca3d534006" (UID: "777e7a72-1307-4f36-ae59-77ca3d534006"). InnerVolumeSpecName "kube-api-access-j87ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.453816 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-scripts" (OuterVolumeSpecName: "scripts") pod "777e7a72-1307-4f36-ae59-77ca3d534006" (UID: "777e7a72-1307-4f36-ae59-77ca3d534006"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.483767 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "777e7a72-1307-4f36-ae59-77ca3d534006" (UID: "777e7a72-1307-4f36-ae59-77ca3d534006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.520907 4585 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.520932 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j87ft\" (UniqueName: \"kubernetes.io/projected/777e7a72-1307-4f36-ae59-77ca3d534006-kube-api-access-j87ft\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.520941 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.520949 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.562696 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data" (OuterVolumeSpecName: "config-data") pod "777e7a72-1307-4f36-ae59-77ca3d534006" (UID: "777e7a72-1307-4f36-ae59-77ca3d534006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:08 crc kubenswrapper[4585]: I0215 17:23:08.622622 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777e7a72-1307-4f36-ae59-77ca3d534006-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.019058 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"777e7a72-1307-4f36-ae59-77ca3d534006","Type":"ContainerDied","Data":"160e87acfac7214079ec9c7c7cd44347fc395311dd64309b2a2e66b6789054e5"} Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.019117 4585 scope.go:117] "RemoveContainer" containerID="234812b9bd2ca339bd36f91e68c63f69f2bf3c1eb4b9b5ac00177f08829312b7" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.019132 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.023504 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerStarted","Data":"7738aae650a3573eea0e7313741be3f0168e818595b80dc9161a05a76e8dcc0e"} Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.051643 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.063336 4585 scope.go:117] "RemoveContainer" containerID="282499d815d7b33bb63ed79d14dc7d8c44846b2631dcd8d50a16a5d1829b01ba" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.077801 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.099847 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 15 17:23:09 crc kubenswrapper[4585]: E0215 17:23:09.100264 4585 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-api" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100274 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-api" Feb 15 17:23:09 crc kubenswrapper[4585]: E0215 17:23:09.100299 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-httpd" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100305 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-httpd" Feb 15 17:23:09 crc kubenswrapper[4585]: E0215 17:23:09.100322 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api-log" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100328 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api-log" Feb 15 17:23:09 crc kubenswrapper[4585]: E0215 17:23:09.100353 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100359 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100567 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-api" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100583 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api-log" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100590 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="777e7a72-1307-4f36-ae59-77ca3d534006" containerName="cinder-api" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.100615 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a588849b-011b-4d05-90d1-e5a41644a556" containerName="neutron-httpd" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.101659 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.105013 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.105277 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.105398 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.112774 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.231504 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.231626 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8636bbb9-0fc2-4481-b149-bc30884e3819-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.231646 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-scripts\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.231780 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-config-data\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.231880 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd2rc\" (UniqueName: \"kubernetes.io/projected/8636bbb9-0fc2-4481-b149-bc30884e3819-kube-api-access-rd2rc\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.231989 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.232039 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-config-data-custom\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.232097 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.232138 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8636bbb9-0fc2-4481-b149-bc30884e3819-logs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333438 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8636bbb9-0fc2-4481-b149-bc30884e3819-logs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333497 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333577 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8636bbb9-0fc2-4481-b149-bc30884e3819-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333610 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-scripts\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333639 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-config-data\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333670 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd2rc\" (UniqueName: \"kubernetes.io/projected/8636bbb9-0fc2-4481-b149-bc30884e3819-kube-api-access-rd2rc\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333714 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333734 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-config-data-custom\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333733 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8636bbb9-0fc2-4481-b149-bc30884e3819-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333761 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.333937 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8636bbb9-0fc2-4481-b149-bc30884e3819-logs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.345164 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.347131 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.349700 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.350462 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-config-data\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.355013 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-scripts\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.359218 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8636bbb9-0fc2-4481-b149-bc30884e3819-config-data-custom\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.366092 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd2rc\" (UniqueName: \"kubernetes.io/projected/8636bbb9-0fc2-4481-b149-bc30884e3819-kube-api-access-rd2rc\") pod \"cinder-api-0\" (UID: \"8636bbb9-0fc2-4481-b149-bc30884e3819\") " pod="openstack/cinder-api-0" Feb 15 17:23:09 crc kubenswrapper[4585]: I0215 17:23:09.443280 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 15 17:23:10 crc kubenswrapper[4585]: I0215 17:23:10.037715 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerStarted","Data":"236aaa2bad7127083505fcc58260b6c1613d9bc6975bc4ed64f8a7a3252af5aa"} Feb 15 17:23:10 crc kubenswrapper[4585]: I0215 17:23:10.038082 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:23:10 crc kubenswrapper[4585]: I0215 17:23:10.066420 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.790459136 podStartE2EDuration="8.066403759s" podCreationTimestamp="2026-02-15 17:23:02 +0000 UTC" firstStartedPulling="2026-02-15 17:23:05.300243706 +0000 UTC m=+1041.243651838" lastFinishedPulling="2026-02-15 17:23:09.576188339 +0000 UTC m=+1045.519596461" observedRunningTime="2026-02-15 17:23:10.05430746 +0000 UTC m=+1045.997715592" watchObservedRunningTime="2026-02-15 17:23:10.066403759 +0000 UTC m=+1046.009811891" Feb 15 17:23:10 crc kubenswrapper[4585]: I0215 17:23:10.150769 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 15 17:23:10 crc kubenswrapper[4585]: I0215 17:23:10.865998 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777e7a72-1307-4f36-ae59-77ca3d534006" path="/var/lib/kubelet/pods/777e7a72-1307-4f36-ae59-77ca3d534006/volumes" Feb 15 17:23:10 crc kubenswrapper[4585]: I0215 17:23:10.893821 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.199:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:23:11 crc kubenswrapper[4585]: I0215 17:23:11.070086 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8636bbb9-0fc2-4481-b149-bc30884e3819","Type":"ContainerStarted","Data":"fba8297f55eb11f32070f0d92fda420b1a444b932a7f78b2f9eefa4712f7fea5"} Feb 15 17:23:11 crc kubenswrapper[4585]: I0215 17:23:11.461098 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 15 17:23:11 crc kubenswrapper[4585]: I0215 17:23:11.500260 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:23:12 crc kubenswrapper[4585]: I0215 17:23:12.079931 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8636bbb9-0fc2-4481-b149-bc30884e3819","Type":"ContainerStarted","Data":"f4aa65e88e347750ee71f1f4439ee905961343c2b4ab593757a8ae8b5fa4cf9b"} Feb 15 17:23:12 crc kubenswrapper[4585]: I0215 17:23:12.080322 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 15 17:23:12 crc kubenswrapper[4585]: I0215 17:23:12.080335 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8636bbb9-0fc2-4481-b149-bc30884e3819","Type":"ContainerStarted","Data":"7180dba6771b2beacb8e0c2cec26417c9257f537e22b4c2f13dda7150d241599"} Feb 15 17:23:12 crc kubenswrapper[4585]: I0215 17:23:12.116926 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.116906743 podStartE2EDuration="3.116906743s" podCreationTimestamp="2026-02-15 17:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:12.096415715 +0000 UTC m=+1048.039823847" watchObservedRunningTime="2026-02-15 
17:23:12.116906743 +0000 UTC m=+1048.060314875" Feb 15 17:23:13 crc kubenswrapper[4585]: I0215 17:23:13.104267 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 15 17:23:13 crc kubenswrapper[4585]: I0215 17:23:13.168483 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:23:13 crc kubenswrapper[4585]: I0215 17:23:13.427735 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:23:13 crc kubenswrapper[4585]: I0215 17:23:13.500794 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-wc2pz"] Feb 15 17:23:13 crc kubenswrapper[4585]: I0215 17:23:13.501022 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" podUID="6c4de367-d702-449a-b5a5-3db1c142a219" containerName="dnsmasq-dns" containerID="cri-o://2c8e51c2515d518daa5c9360b926fa443780f5165f4d0019088940159115be39" gracePeriod=10 Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.181764 4585 generic.go:334] "Generic (PLEG): container finished" podID="6c4de367-d702-449a-b5a5-3db1c142a219" containerID="2c8e51c2515d518daa5c9360b926fa443780f5165f4d0019088940159115be39" exitCode=0 Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.182288 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="cinder-scheduler" containerID="cri-o://1303185595dc90a7aa304742be94cbe74a55b8144a18531f1e50e4980afa62c8" gracePeriod=30 Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.182614 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" 
event={"ID":"6c4de367-d702-449a-b5a5-3db1c142a219","Type":"ContainerDied","Data":"2c8e51c2515d518daa5c9360b926fa443780f5165f4d0019088940159115be39"} Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.182907 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="probe" containerID="cri-o://2f03fa3215f5f5c40ec37cfe23f561dc1df63cec3a92a2d65f6aee8f21d8a3e4" gracePeriod=30 Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.364556 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.410408 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-nb\") pod \"6c4de367-d702-449a-b5a5-3db1c142a219\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.410479 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-swift-storage-0\") pod \"6c4de367-d702-449a-b5a5-3db1c142a219\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.410522 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2zvc\" (UniqueName: \"kubernetes.io/projected/6c4de367-d702-449a-b5a5-3db1c142a219-kube-api-access-v2zvc\") pod \"6c4de367-d702-449a-b5a5-3db1c142a219\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.410576 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-sb\") pod \"6c4de367-d702-449a-b5a5-3db1c142a219\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.410658 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-svc\") pod \"6c4de367-d702-449a-b5a5-3db1c142a219\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.410713 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-config\") pod \"6c4de367-d702-449a-b5a5-3db1c142a219\" (UID: \"6c4de367-d702-449a-b5a5-3db1c142a219\") " Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.440381 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4de367-d702-449a-b5a5-3db1c142a219-kube-api-access-v2zvc" (OuterVolumeSpecName: "kube-api-access-v2zvc") pod "6c4de367-d702-449a-b5a5-3db1c142a219" (UID: "6c4de367-d702-449a-b5a5-3db1c142a219"). InnerVolumeSpecName "kube-api-access-v2zvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.518762 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2zvc\" (UniqueName: \"kubernetes.io/projected/6c4de367-d702-449a-b5a5-3db1c142a219-kube-api-access-v2zvc\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.614171 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c4de367-d702-449a-b5a5-3db1c142a219" (UID: "6c4de367-d702-449a-b5a5-3db1c142a219"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.621899 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.640796 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-config" (OuterVolumeSpecName: "config") pod "6c4de367-d702-449a-b5a5-3db1c142a219" (UID: "6c4de367-d702-449a-b5a5-3db1c142a219"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.643035 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c4de367-d702-449a-b5a5-3db1c142a219" (UID: "6c4de367-d702-449a-b5a5-3db1c142a219"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.662241 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c4de367-d702-449a-b5a5-3db1c142a219" (UID: "6c4de367-d702-449a-b5a5-3db1c142a219"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.679475 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c4de367-d702-449a-b5a5-3db1c142a219" (UID: "6c4de367-d702-449a-b5a5-3db1c142a219"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.723842 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.723870 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.723882 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:14 crc kubenswrapper[4585]: I0215 17:23:14.723890 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4de367-d702-449a-b5a5-3db1c142a219-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:15 crc kubenswrapper[4585]: I0215 17:23:15.192108 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" event={"ID":"6c4de367-d702-449a-b5a5-3db1c142a219","Type":"ContainerDied","Data":"70a10254114eb266dd2c713bc8fa9731240c6c98a6a5227b9459673378692eae"} Feb 15 17:23:15 crc kubenswrapper[4585]: I0215 17:23:15.192898 4585 scope.go:117] "RemoveContainer" containerID="2c8e51c2515d518daa5c9360b926fa443780f5165f4d0019088940159115be39" Feb 15 17:23:15 crc kubenswrapper[4585]: I0215 17:23:15.192213 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-wc2pz" Feb 15 17:23:15 crc kubenswrapper[4585]: I0215 17:23:15.216895 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-wc2pz"] Feb 15 17:23:15 crc kubenswrapper[4585]: I0215 17:23:15.226306 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-wc2pz"] Feb 15 17:23:16 crc kubenswrapper[4585]: I0215 17:23:16.780897 4585 scope.go:117] "RemoveContainer" containerID="2097458816da5299cc074e42360d0c92c543719a2316c4ad53f818567ea9f40d" Feb 15 17:23:16 crc kubenswrapper[4585]: I0215 17:23:16.787079 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:23:16 crc kubenswrapper[4585]: I0215 17:23:16.836326 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.199:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:23:17 crc kubenswrapper[4585]: I0215 17:23:17.101868 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4de367-d702-449a-b5a5-3db1c142a219" path="/var/lib/kubelet/pods/6c4de367-d702-449a-b5a5-3db1c142a219/volumes" Feb 15 17:23:17 crc kubenswrapper[4585]: E0215 17:23:17.493234 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32f9940b_28db_47ee_9a5b_771a1f757af5.slice/crio-conmon-2f03fa3215f5f5c40ec37cfe23f561dc1df63cec3a92a2d65f6aee8f21d8a3e4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32f9940b_28db_47ee_9a5b_771a1f757af5.slice/crio-2f03fa3215f5f5c40ec37cfe23f561dc1df63cec3a92a2d65f6aee8f21d8a3e4.scope\": RecentStats: unable to find data in memory cache]" Feb 15 17:23:18 crc kubenswrapper[4585]: I0215 17:23:18.117536 4585 generic.go:334] "Generic (PLEG): container finished" podID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerID="2f03fa3215f5f5c40ec37cfe23f561dc1df63cec3a92a2d65f6aee8f21d8a3e4" exitCode=0 Feb 15 17:23:18 crc kubenswrapper[4585]: I0215 17:23:18.117580 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32f9940b-28db-47ee-9a5b-771a1f757af5","Type":"ContainerDied","Data":"2f03fa3215f5f5c40ec37cfe23f561dc1df63cec3a92a2d65f6aee8f21d8a3e4"} Feb 15 17:23:18 crc kubenswrapper[4585]: I0215 17:23:18.421901 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.199:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.045143 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f6d69d-zt7nc" Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.069761 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.235827 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6f6d69d-zt7nc" podUID="00bfac9c-5e69-41c0-813a-c163bb169b0d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.204:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 
17:23:19.600398 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.667436 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f6d69d-zt7nc" Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.732083 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66d8b59fb-fq9sk"] Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.732256 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" containerID="cri-o://b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3" gracePeriod=30 Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.732587 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api" containerID="cri-o://77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4" gracePeriod=30 Feb 15 17:23:19 crc kubenswrapper[4585]: I0215 17:23:19.980500 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5dfc4c95db-jlklr" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.062369 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fd7545564-bhhq2"] Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.062480 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.062600 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-fd7545564-bhhq2" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-log" 
containerID="cri-o://293806b16be4a9fa335a0560f9a9b284c69a1d4a12284279b8cc229ee00ab421" gracePeriod=30 Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.062732 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-fd7545564-bhhq2" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-api" containerID="cri-o://4b4eec435e566ac9235ee7e885ef157182b30dadaf697753940f4d3947eab502" gracePeriod=30 Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.069848 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-fd7545564-bhhq2" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.193:8778/\": EOF" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.152946 4585 generic.go:334] "Generic (PLEG): container finished" podID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerID="b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3" exitCode=143 Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.153045 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d8b59fb-fq9sk" event={"ID":"d7374d11-1b9c-4fe9-bc58-82cdd265f67f","Type":"ContainerDied","Data":"b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3"} Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.160967 4585 generic.go:334] "Generic (PLEG): container finished" podID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerID="1303185595dc90a7aa304742be94cbe74a55b8144a18531f1e50e4980afa62c8" exitCode=0 Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.161054 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32f9940b-28db-47ee-9a5b-771a1f757af5","Type":"ContainerDied","Data":"1303185595dc90a7aa304742be94cbe74a55b8144a18531f1e50e4980afa62c8"} Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.668529 4585 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.812888 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data\") pod \"32f9940b-28db-47ee-9a5b-771a1f757af5\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.813057 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f9940b-28db-47ee-9a5b-771a1f757af5-etc-machine-id\") pod \"32f9940b-28db-47ee-9a5b-771a1f757af5\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.813083 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-combined-ca-bundle\") pod \"32f9940b-28db-47ee-9a5b-771a1f757af5\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.813112 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data-custom\") pod \"32f9940b-28db-47ee-9a5b-771a1f757af5\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.813163 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-scripts\") pod \"32f9940b-28db-47ee-9a5b-771a1f757af5\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.813271 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9rt48\" (UniqueName: \"kubernetes.io/projected/32f9940b-28db-47ee-9a5b-771a1f757af5-kube-api-access-9rt48\") pod \"32f9940b-28db-47ee-9a5b-771a1f757af5\" (UID: \"32f9940b-28db-47ee-9a5b-771a1f757af5\") " Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.814608 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f9940b-28db-47ee-9a5b-771a1f757af5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "32f9940b-28db-47ee-9a5b-771a1f757af5" (UID: "32f9940b-28db-47ee-9a5b-771a1f757af5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.820309 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f9940b-28db-47ee-9a5b-771a1f757af5-kube-api-access-9rt48" (OuterVolumeSpecName: "kube-api-access-9rt48") pod "32f9940b-28db-47ee-9a5b-771a1f757af5" (UID: "32f9940b-28db-47ee-9a5b-771a1f757af5"). InnerVolumeSpecName "kube-api-access-9rt48". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.830380 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "32f9940b-28db-47ee-9a5b-771a1f757af5" (UID: "32f9940b-28db-47ee-9a5b-771a1f757af5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.831002 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-scripts" (OuterVolumeSpecName: "scripts") pod "32f9940b-28db-47ee-9a5b-771a1f757af5" (UID: "32f9940b-28db-47ee-9a5b-771a1f757af5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.927945 4585 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32f9940b-28db-47ee-9a5b-771a1f757af5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.928317 4585 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.928386 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:20 crc kubenswrapper[4585]: I0215 17:23:20.928443 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rt48\" (UniqueName: \"kubernetes.io/projected/32f9940b-28db-47ee-9a5b-771a1f757af5-kube-api-access-9rt48\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.016723 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32f9940b-28db-47ee-9a5b-771a1f757af5" (UID: "32f9940b-28db-47ee-9a5b-771a1f757af5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.029786 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data" (OuterVolumeSpecName: "config-data") pod "32f9940b-28db-47ee-9a5b-771a1f757af5" (UID: "32f9940b-28db-47ee-9a5b-771a1f757af5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.029852 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.131544 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32f9940b-28db-47ee-9a5b-771a1f757af5-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.170179 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"32f9940b-28db-47ee-9a5b-771a1f757af5","Type":"ContainerDied","Data":"aea6a8f201305e68089098f79b480b277939a47371627dd0c2345f717990d4c5"} Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.170225 4585 scope.go:117] "RemoveContainer" containerID="2f03fa3215f5f5c40ec37cfe23f561dc1df63cec3a92a2d65f6aee8f21d8a3e4" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.170324 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.185675 4585 generic.go:334] "Generic (PLEG): container finished" podID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerID="293806b16be4a9fa335a0560f9a9b284c69a1d4a12284279b8cc229ee00ab421" exitCode=143 Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.185726 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fd7545564-bhhq2" event={"ID":"77bb78e4-1277-494a-a47e-3a2c4ce18228","Type":"ContainerDied","Data":"293806b16be4a9fa335a0560f9a9b284c69a1d4a12284279b8cc229ee00ab421"} Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.197008 4585 scope.go:117] "RemoveContainer" containerID="1303185595dc90a7aa304742be94cbe74a55b8144a18531f1e50e4980afa62c8" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.218685 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.237397 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.251111 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:23:21 crc kubenswrapper[4585]: E0215 17:23:21.251666 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4de367-d702-449a-b5a5-3db1c142a219" containerName="dnsmasq-dns" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.251684 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4de367-d702-449a-b5a5-3db1c142a219" containerName="dnsmasq-dns" Feb 15 17:23:21 crc kubenswrapper[4585]: E0215 17:23:21.251698 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="probe" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.251707 4585 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="probe" Feb 15 17:23:21 crc kubenswrapper[4585]: E0215 17:23:21.251730 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4de367-d702-449a-b5a5-3db1c142a219" containerName="init" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.251736 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4de367-d702-449a-b5a5-3db1c142a219" containerName="init" Feb 15 17:23:21 crc kubenswrapper[4585]: E0215 17:23:21.251770 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="cinder-scheduler" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.251776 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="cinder-scheduler" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.251984 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="probe" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.252006 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" containerName="cinder-scheduler" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.252021 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4de367-d702-449a-b5a5-3db1c142a219" containerName="dnsmasq-dns" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.253104 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.258107 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.293687 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.336287 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.336328 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.336348 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.336431 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdtm5\" (UniqueName: \"kubernetes.io/projected/44c6f599-40fd-4592-9275-f1158b3126b0-kube-api-access-qdtm5\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: 
I0215 17:23:21.336450 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44c6f599-40fd-4592-9275-f1158b3126b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.336477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.464153 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.464197 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.464220 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.464308 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdtm5\" 
(UniqueName: \"kubernetes.io/projected/44c6f599-40fd-4592-9275-f1158b3126b0-kube-api-access-qdtm5\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.464330 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44c6f599-40fd-4592-9275-f1158b3126b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.464360 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.465889 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44c6f599-40fd-4592-9275-f1158b3126b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.473528 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.474653 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " 
pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.476424 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.484126 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44c6f599-40fd-4592-9275-f1158b3126b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.500106 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdtm5\" (UniqueName: \"kubernetes.io/projected/44c6f599-40fd-4592-9275-f1158b3126b0-kube-api-access-qdtm5\") pod \"cinder-scheduler-0\" (UID: \"44c6f599-40fd-4592-9275-f1158b3126b0\") " pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.569681 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.641305 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-756b6f8c5f-5zd85" Feb 15 17:23:21 crc kubenswrapper[4585]: I0215 17:23:21.754330 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="8636bbb9-0fc2-4481-b149-bc30884e3819" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.205:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 15 17:23:22 crc kubenswrapper[4585]: I0215 17:23:22.345742 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 15 17:23:22 crc kubenswrapper[4585]: I0215 17:23:22.863424 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f9940b-28db-47ee-9a5b-771a1f757af5" path="/var/lib/kubelet/pods/32f9940b-28db-47ee-9a5b-771a1f757af5/volumes" Feb 15 17:23:23 crc kubenswrapper[4585]: I0215 17:23:23.250802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44c6f599-40fd-4592-9275-f1158b3126b0","Type":"ContainerStarted","Data":"3c53168f202c8ae897b22f26649ba7d8464688ac6d569cfd9ccba950e7c0dea0"} Feb 15 17:23:23 crc kubenswrapper[4585]: I0215 17:23:23.251073 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44c6f599-40fd-4592-9275-f1158b3126b0","Type":"ContainerStarted","Data":"d3684da83dccbe1b68bb1ea3b286bb7768d78485892fa9a482ef31eb71de954f"} Feb 15 17:23:23 crc kubenswrapper[4585]: I0215 17:23:23.381103 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.199:9311/healthcheck\": dial tcp 10.217.0.199:9311: connect: connection 
refused" Feb 15 17:23:23 crc kubenswrapper[4585]: I0215 17:23:23.381126 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-66d8b59fb-fq9sk" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.199:9311/healthcheck\": dial tcp 10.217.0.199:9311: connect: connection refused" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.063734 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.161131 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffv7v\" (UniqueName: \"kubernetes.io/projected/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-kube-api-access-ffv7v\") pod \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.161489 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data\") pod \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.161676 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-combined-ca-bundle\") pod \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.161763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-logs\") pod \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " Feb 15 17:23:24 
crc kubenswrapper[4585]: I0215 17:23:24.161869 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data-custom\") pod \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\" (UID: \"d7374d11-1b9c-4fe9-bc58-82cdd265f67f\") " Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.164026 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-logs" (OuterVolumeSpecName: "logs") pod "d7374d11-1b9c-4fe9-bc58-82cdd265f67f" (UID: "d7374d11-1b9c-4fe9-bc58-82cdd265f67f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.171793 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d7374d11-1b9c-4fe9-bc58-82cdd265f67f" (UID: "d7374d11-1b9c-4fe9-bc58-82cdd265f67f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.171797 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-kube-api-access-ffv7v" (OuterVolumeSpecName: "kube-api-access-ffv7v") pod "d7374d11-1b9c-4fe9-bc58-82cdd265f67f" (UID: "d7374d11-1b9c-4fe9-bc58-82cdd265f67f"). InnerVolumeSpecName "kube-api-access-ffv7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.226863 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7374d11-1b9c-4fe9-bc58-82cdd265f67f" (UID: "d7374d11-1b9c-4fe9-bc58-82cdd265f67f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.238011 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data" (OuterVolumeSpecName: "config-data") pod "d7374d11-1b9c-4fe9-bc58-82cdd265f67f" (UID: "d7374d11-1b9c-4fe9-bc58-82cdd265f67f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.262187 4585 generic.go:334] "Generic (PLEG): container finished" podID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerID="77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4" exitCode=0 Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.262363 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d8b59fb-fq9sk" event={"ID":"d7374d11-1b9c-4fe9-bc58-82cdd265f67f","Type":"ContainerDied","Data":"77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4"} Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.262447 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66d8b59fb-fq9sk" event={"ID":"d7374d11-1b9c-4fe9-bc58-82cdd265f67f","Type":"ContainerDied","Data":"769bdc99f1900744c8398ae49672c7cdbb9f8be05d82d89ca713d3916f3e1ab2"} Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.262522 4585 scope.go:117] "RemoveContainer" containerID="77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4" Feb 15 
17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.262703 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66d8b59fb-fq9sk" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.264591 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.264628 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.264639 4585 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.264649 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffv7v\" (UniqueName: \"kubernetes.io/projected/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-kube-api-access-ffv7v\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.264659 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7374d11-1b9c-4fe9-bc58-82cdd265f67f-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.276648 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"44c6f599-40fd-4592-9275-f1158b3126b0","Type":"ContainerStarted","Data":"9fa1e95cef85093ad41de9749d30aca455e4facb1362bc307f5436e140a5d020"} Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.298916 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=3.298904565 podStartE2EDuration="3.298904565s" podCreationTimestamp="2026-02-15 17:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:24.295854142 +0000 UTC m=+1060.239262274" watchObservedRunningTime="2026-02-15 17:23:24.298904565 +0000 UTC m=+1060.242312697" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.344102 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-66d8b59fb-fq9sk"] Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.365229 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-66d8b59fb-fq9sk"] Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.392669 4585 scope.go:117] "RemoveContainer" containerID="b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.420512 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 15 17:23:24 crc kubenswrapper[4585]: E0215 17:23:24.421136 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.421220 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" Feb 15 17:23:24 crc kubenswrapper[4585]: E0215 17:23:24.421298 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.421348 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.421633 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.421708 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" containerName="barbican-api-log" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.422387 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.434731 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.434909 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4bxfn" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.435070 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.458722 4585 scope.go:117] "RemoveContainer" containerID="77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.458777 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="8636bbb9-0fc2-4481-b149-bc30884e3819" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.205:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 15 17:23:24 crc kubenswrapper[4585]: E0215 17:23:24.460412 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4\": container with ID starting with 77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4 not found: ID does not exist" containerID="77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4" Feb 
15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.460447 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4"} err="failed to get container status \"77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4\": rpc error: code = NotFound desc = could not find container \"77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4\": container with ID starting with 77fb97ba06d03a62c59a44e7d63ad188a706ae45800aa7a46024abe9c957f3d4 not found: ID does not exist" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.460467 4585 scope.go:117] "RemoveContainer" containerID="b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.467706 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 15 17:23:24 crc kubenswrapper[4585]: E0215 17:23:24.468370 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3\": container with ID starting with b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3 not found: ID does not exist" containerID="b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.468482 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3"} err="failed to get container status \"b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3\": rpc error: code = NotFound desc = could not find container \"b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3\": container with ID starting with b0d313924a37df08e21fa7680cc2bdc8c006c5a17488c2db5e88e7d2892718b3 not found: ID does not exist" Feb 15 17:23:24 crc 
kubenswrapper[4585]: I0215 17:23:24.469768 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-openstack-config-secret\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.469868 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-openstack-config\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.470071 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.470160 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkglz\" (UniqueName: \"kubernetes.io/projected/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-kube-api-access-fkglz\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.572434 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.572471 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkglz\" (UniqueName: \"kubernetes.io/projected/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-kube-api-access-fkglz\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.572527 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-openstack-config-secret\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.572549 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-openstack-config\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.573747 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-openstack-config\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.579517 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-openstack-config-secret\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.592490 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.612478 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkglz\" (UniqueName: \"kubernetes.io/projected/3682ca9c-f964-4e4b-ba4a-489a96ef3f65-kube-api-access-fkglz\") pod \"openstackclient\" (UID: \"3682ca9c-f964-4e4b-ba4a-489a96ef3f65\") " pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.680230 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-fd7545564-bhhq2" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.193:8778/\": read tcp 10.217.0.2:36308->10.217.0.193:8778: read: connection reset by peer" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.762236 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 15 17:23:24 crc kubenswrapper[4585]: I0215 17:23:24.865837 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7374d11-1b9c-4fe9-bc58-82cdd265f67f" path="/var/lib/kubelet/pods/d7374d11-1b9c-4fe9-bc58-82cdd265f67f/volumes" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.300830 4585 generic.go:334] "Generic (PLEG): container finished" podID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerID="4b4eec435e566ac9235ee7e885ef157182b30dadaf697753940f4d3947eab502" exitCode=0 Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.301124 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fd7545564-bhhq2" event={"ID":"77bb78e4-1277-494a-a47e-3a2c4ce18228","Type":"ContainerDied","Data":"4b4eec435e566ac9235ee7e885ef157182b30dadaf697753940f4d3947eab502"} Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.561101 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.677896 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.796231 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-combined-ca-bundle\") pod \"77bb78e4-1277-494a-a47e-3a2c4ce18228\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.796334 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-scripts\") pod \"77bb78e4-1277-494a-a47e-3a2c4ce18228\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.796402 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnplt\" (UniqueName: \"kubernetes.io/projected/77bb78e4-1277-494a-a47e-3a2c4ce18228-kube-api-access-vnplt\") pod \"77bb78e4-1277-494a-a47e-3a2c4ce18228\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.796559 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-public-tls-certs\") pod \"77bb78e4-1277-494a-a47e-3a2c4ce18228\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.796581 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-config-data\") pod \"77bb78e4-1277-494a-a47e-3a2c4ce18228\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.796625 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/77bb78e4-1277-494a-a47e-3a2c4ce18228-logs\") pod \"77bb78e4-1277-494a-a47e-3a2c4ce18228\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.796666 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-internal-tls-certs\") pod \"77bb78e4-1277-494a-a47e-3a2c4ce18228\" (UID: \"77bb78e4-1277-494a-a47e-3a2c4ce18228\") " Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.798359 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77bb78e4-1277-494a-a47e-3a2c4ce18228-logs" (OuterVolumeSpecName: "logs") pod "77bb78e4-1277-494a-a47e-3a2c4ce18228" (UID: "77bb78e4-1277-494a-a47e-3a2c4ce18228"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.802670 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bb78e4-1277-494a-a47e-3a2c4ce18228-kube-api-access-vnplt" (OuterVolumeSpecName: "kube-api-access-vnplt") pod "77bb78e4-1277-494a-a47e-3a2c4ce18228" (UID: "77bb78e4-1277-494a-a47e-3a2c4ce18228"). InnerVolumeSpecName "kube-api-access-vnplt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.810764 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-scripts" (OuterVolumeSpecName: "scripts") pod "77bb78e4-1277-494a-a47e-3a2c4ce18228" (UID: "77bb78e4-1277-494a-a47e-3a2c4ce18228"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.882706 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77bb78e4-1277-494a-a47e-3a2c4ce18228" (UID: "77bb78e4-1277-494a-a47e-3a2c4ce18228"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.903498 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77bb78e4-1277-494a-a47e-3a2c4ce18228-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.905237 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.905339 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.905417 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnplt\" (UniqueName: \"kubernetes.io/projected/77bb78e4-1277-494a-a47e-3a2c4ce18228-kube-api-access-vnplt\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.922094 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-config-data" (OuterVolumeSpecName: "config-data") pod "77bb78e4-1277-494a-a47e-3a2c4ce18228" (UID: "77bb78e4-1277-494a-a47e-3a2c4ce18228"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:25 crc kubenswrapper[4585]: I0215 17:23:25.983871 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "77bb78e4-1277-494a-a47e-3a2c4ce18228" (UID: "77bb78e4-1277-494a-a47e-3a2c4ce18228"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.011910 4585 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.011949 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.036009 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "77bb78e4-1277-494a-a47e-3a2c4ce18228" (UID: "77bb78e4-1277-494a-a47e-3a2c4ce18228"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.114086 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77bb78e4-1277-494a-a47e-3a2c4ce18228-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.320641 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3682ca9c-f964-4e4b-ba4a-489a96ef3f65","Type":"ContainerStarted","Data":"63dc632107ae3a79af415d3440ce1c36cdee494dbcf968cde0286e53912c3292"} Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.322960 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fd7545564-bhhq2" event={"ID":"77bb78e4-1277-494a-a47e-3a2c4ce18228","Type":"ContainerDied","Data":"8752aad40d4846c3b5a6510391f9be277b35c2b5abc35b1a9ba4b101b2d9f42e"} Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.323007 4585 scope.go:117] "RemoveContainer" containerID="4b4eec435e566ac9235ee7e885ef157182b30dadaf697753940f4d3947eab502" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.323105 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fd7545564-bhhq2" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.360216 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fd7545564-bhhq2"] Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.362796 4585 scope.go:117] "RemoveContainer" containerID="293806b16be4a9fa335a0560f9a9b284c69a1d4a12284279b8cc229ee00ab421" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.386953 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fd7545564-bhhq2"] Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.570822 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 15 17:23:26 crc kubenswrapper[4585]: I0215 17:23:26.873127 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" path="/var/lib/kubelet/pods/77bb78e4-1277-494a-a47e-3a2c4ce18228/volumes" Feb 15 17:23:27 crc kubenswrapper[4585]: I0215 17:23:27.620037 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 15 17:23:28 crc kubenswrapper[4585]: I0215 17:23:28.671060 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-65d8658977-kc9xn" Feb 15 17:23:28 crc kubenswrapper[4585]: I0215 17:23:28.724774 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-657d9d46dd-264lh"] Feb 15 17:23:28 crc kubenswrapper[4585]: I0215 17:23:28.725016 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-657d9d46dd-264lh" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-api" containerID="cri-o://b4152a59efd5841b1bfca434f1be40874851466c91d0256932fbdb62beef2fa1" gracePeriod=30 Feb 15 17:23:28 crc kubenswrapper[4585]: I0215 17:23:28.725295 4585 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-657d9d46dd-264lh" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-httpd" containerID="cri-o://ac218d79b8c9ccf64945c41c3f474ce122891000dc4674623580d8148f91dba3" gracePeriod=30 Feb 15 17:23:29 crc kubenswrapper[4585]: I0215 17:23:29.364192 4585 generic.go:334] "Generic (PLEG): container finished" podID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerID="ac218d79b8c9ccf64945c41c3f474ce122891000dc4674623580d8148f91dba3" exitCode=0 Feb 15 17:23:29 crc kubenswrapper[4585]: I0215 17:23:29.364470 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-657d9d46dd-264lh" event={"ID":"83f72fb7-0fae-45bd-894b-0b8235e489eb","Type":"ContainerDied","Data":"ac218d79b8c9ccf64945c41c3f474ce122891000dc4674623580d8148f91dba3"} Feb 15 17:23:30 crc kubenswrapper[4585]: I0215 17:23:30.382181 4585 generic.go:334] "Generic (PLEG): container finished" podID="b1bd46e7-0703-49b5-81f2-516568284547" containerID="03cde210d7e2baae60bc76453feecb1542812009c51804c394167be427f59f34" exitCode=137 Feb 15 17:23:30 crc kubenswrapper[4585]: I0215 17:23:30.382234 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb7dd448-vc5x5" event={"ID":"b1bd46e7-0703-49b5-81f2-516568284547","Type":"ContainerDied","Data":"03cde210d7e2baae60bc76453feecb1542812009c51804c394167be427f59f34"} Feb 15 17:23:30 crc kubenswrapper[4585]: I0215 17:23:30.392126 4585 generic.go:334] "Generic (PLEG): container finished" podID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerID="e6a81c4c256ad7683acf59828ce484c3e06e042226f140242575aec7b2779784" exitCode=137 Feb 15 17:23:30 crc kubenswrapper[4585]: I0215 17:23:30.392163 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerDied","Data":"e6a81c4c256ad7683acf59828ce484c3e06e042226f140242575aec7b2779784"} Feb 15 17:23:30 crc kubenswrapper[4585]: I0215 17:23:30.392187 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerStarted","Data":"7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f"} Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.407996 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb7dd448-vc5x5" event={"ID":"b1bd46e7-0703-49b5-81f2-516568284547","Type":"ContainerStarted","Data":"d01e34d88d41c2f497ca91c0ab4c87883a6e19c0ac368b221e8e825d565e3a27"} Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.836294 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.928778 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d9f8f5c8c-pfm4d"] Feb 15 17:23:31 crc kubenswrapper[4585]: E0215 17:23:31.929238 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-log" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.929254 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-log" Feb 15 17:23:31 crc kubenswrapper[4585]: E0215 17:23:31.929275 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-api" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.929281 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-api" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.929502 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-log" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.929531 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="77bb78e4-1277-494a-a47e-3a2c4ce18228" containerName="placement-api" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.930637 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.932788 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.933198 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.934839 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 15 17:23:31 crc kubenswrapper[4585]: I0215 17:23:31.955645 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d9f8f5c8c-pfm4d"] Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.043457 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-public-tls-certs\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.043525 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhrgk\" (UniqueName: \"kubernetes.io/projected/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-kube-api-access-nhrgk\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.043717 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-etc-swift\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.043807 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-log-httpd\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.043899 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-internal-tls-certs\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.043950 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-combined-ca-bundle\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.044004 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-run-httpd\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.044159 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-config-data\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.147757 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-public-tls-certs\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.147816 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhrgk\" (UniqueName: \"kubernetes.io/projected/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-kube-api-access-nhrgk\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.147870 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-etc-swift\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.147902 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-log-httpd\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.147941 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-internal-tls-certs\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.147966 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-combined-ca-bundle\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.147993 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-run-httpd\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.148551 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-log-httpd\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.148648 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-run-httpd\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.148864 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-config-data\") pod 
\"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.156884 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-internal-tls-certs\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.161166 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-etc-swift\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.163469 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-config-data\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.164293 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-public-tls-certs\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.172981 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhrgk\" (UniqueName: \"kubernetes.io/projected/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-kube-api-access-nhrgk\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " 
pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.187776 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/827a8b91-c0e1-4ba9-a90a-e0767e9fb71e-combined-ca-bundle\") pod \"swift-proxy-d9f8f5c8c-pfm4d\" (UID: \"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e\") " pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:32 crc kubenswrapper[4585]: I0215 17:23:32.254792 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:33 crc kubenswrapper[4585]: I0215 17:23:33.271912 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 15 17:23:35 crc kubenswrapper[4585]: I0215 17:23:35.554716 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:35 crc kubenswrapper[4585]: I0215 17:23:35.555153 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="ceilometer-central-agent" containerID="cri-o://c51b4188a7de9475092079c1adc41e734a477d2a9dbcdb76e5aae1c07408bc42" gracePeriod=30 Feb 15 17:23:35 crc kubenswrapper[4585]: I0215 17:23:35.555418 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="proxy-httpd" containerID="cri-o://236aaa2bad7127083505fcc58260b6c1613d9bc6975bc4ed64f8a7a3252af5aa" gracePeriod=30 Feb 15 17:23:35 crc kubenswrapper[4585]: I0215 17:23:35.555478 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="ceilometer-notification-agent" containerID="cri-o://93d552cd7db76d24983637cd4c14652529196123b7e1febb634f201dcb73da0b" gracePeriod=30 Feb 15 
17:23:35 crc kubenswrapper[4585]: I0215 17:23:35.555517 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="sg-core" containerID="cri-o://7738aae650a3573eea0e7313741be3f0168e818595b80dc9161a05a76e8dcc0e" gracePeriod=30 Feb 15 17:23:36 crc kubenswrapper[4585]: I0215 17:23:36.472525 4585 generic.go:334] "Generic (PLEG): container finished" podID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerID="236aaa2bad7127083505fcc58260b6c1613d9bc6975bc4ed64f8a7a3252af5aa" exitCode=0 Feb 15 17:23:36 crc kubenswrapper[4585]: I0215 17:23:36.472556 4585 generic.go:334] "Generic (PLEG): container finished" podID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerID="7738aae650a3573eea0e7313741be3f0168e818595b80dc9161a05a76e8dcc0e" exitCode=2 Feb 15 17:23:36 crc kubenswrapper[4585]: I0215 17:23:36.472564 4585 generic.go:334] "Generic (PLEG): container finished" podID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerID="c51b4188a7de9475092079c1adc41e734a477d2a9dbcdb76e5aae1c07408bc42" exitCode=0 Feb 15 17:23:36 crc kubenswrapper[4585]: I0215 17:23:36.472583 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerDied","Data":"236aaa2bad7127083505fcc58260b6c1613d9bc6975bc4ed64f8a7a3252af5aa"} Feb 15 17:23:36 crc kubenswrapper[4585]: I0215 17:23:36.472624 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerDied","Data":"7738aae650a3573eea0e7313741be3f0168e818595b80dc9161a05a76e8dcc0e"} Feb 15 17:23:36 crc kubenswrapper[4585]: I0215 17:23:36.472634 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerDied","Data":"c51b4188a7de9475092079c1adc41e734a477d2a9dbcdb76e5aae1c07408bc42"} Feb 15 
17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.445963 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.446520 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-log" containerID="cri-o://26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c" gracePeriod=30 Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.446662 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-httpd" containerID="cri-o://b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6" gracePeriod=30 Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.515358 4585 generic.go:334] "Generic (PLEG): container finished" podID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerID="93d552cd7db76d24983637cd4c14652529196123b7e1febb634f201dcb73da0b" exitCode=0 Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.515398 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerDied","Data":"93d552cd7db76d24983637cd4c14652529196123b7e1febb634f201dcb73da0b"} Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.854835 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-dbmb7"] Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.856324 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.904845 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dbmb7"] Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.906716 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ec1717-0d15-46bd-bfa9-e00997de9192-operator-scripts\") pod \"nova-api-db-create-dbmb7\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.906767 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbsfd\" (UniqueName: \"kubernetes.io/projected/b2ec1717-0d15-46bd-bfa9-e00997de9192-kube-api-access-kbsfd\") pod \"nova-api-db-create-dbmb7\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.931783 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.931829 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.959275 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f6lj7"] Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.962668 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:39 crc kubenswrapper[4585]: I0215 17:23:39.977717 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f6lj7"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.008612 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dgxc\" (UniqueName: \"kubernetes.io/projected/38468fa5-3373-42c6-88a2-3b405081fd2f-kube-api-access-2dgxc\") pod \"nova-cell0-db-create-f6lj7\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.008713 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ec1717-0d15-46bd-bfa9-e00997de9192-operator-scripts\") pod \"nova-api-db-create-dbmb7\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.008747 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbsfd\" (UniqueName: \"kubernetes.io/projected/b2ec1717-0d15-46bd-bfa9-e00997de9192-kube-api-access-kbsfd\") pod \"nova-api-db-create-dbmb7\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.008765 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38468fa5-3373-42c6-88a2-3b405081fd2f-operator-scripts\") pod \"nova-cell0-db-create-f6lj7\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.010354 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/b2ec1717-0d15-46bd-bfa9-e00997de9192-operator-scripts\") pod \"nova-api-db-create-dbmb7\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.040369 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbsfd\" (UniqueName: \"kubernetes.io/projected/b2ec1717-0d15-46bd-bfa9-e00997de9192-kube-api-access-kbsfd\") pod \"nova-api-db-create-dbmb7\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.082517 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-x6mtt"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.083848 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.122845 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dgxc\" (UniqueName: \"kubernetes.io/projected/38468fa5-3373-42c6-88a2-3b405081fd2f-kube-api-access-2dgxc\") pod \"nova-cell0-db-create-f6lj7\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.122959 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38468fa5-3373-42c6-88a2-3b405081fd2f-operator-scripts\") pod \"nova-cell0-db-create-f6lj7\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.123638 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38468fa5-3373-42c6-88a2-3b405081fd2f-operator-scripts\") pod 
\"nova-cell0-db-create-f6lj7\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.145410 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-x6mtt"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.174143 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.178368 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.178544 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.182811 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.206768 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dgxc\" (UniqueName: \"kubernetes.io/projected/38468fa5-3373-42c6-88a2-3b405081fd2f-kube-api-access-2dgxc\") pod \"nova-cell0-db-create-f6lj7\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.214915 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-854d-account-create-update-hkbvm"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.216246 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.224855 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.226773 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gj7\" (UniqueName: \"kubernetes.io/projected/e4035ab3-dd31-461d-a31c-d4c01cecd67e-kube-api-access-v9gj7\") pod \"nova-cell1-db-create-x6mtt\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.227163 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4035ab3-dd31-461d-a31c-d4c01cecd67e-operator-scripts\") pod \"nova-cell1-db-create-x6mtt\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.248883 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-854d-account-create-update-hkbvm"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.283064 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.330180 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f90b2d-7253-4232-b9bc-ab80a39d2a86-operator-scripts\") pod \"nova-api-854d-account-create-update-hkbvm\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.330272 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4035ab3-dd31-461d-a31c-d4c01cecd67e-operator-scripts\") pod \"nova-cell1-db-create-x6mtt\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.330300 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfhd7\" (UniqueName: \"kubernetes.io/projected/93f90b2d-7253-4232-b9bc-ab80a39d2a86-kube-api-access-nfhd7\") pod \"nova-api-854d-account-create-update-hkbvm\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.330339 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9gj7\" (UniqueName: \"kubernetes.io/projected/e4035ab3-dd31-461d-a31c-d4c01cecd67e-kube-api-access-v9gj7\") pod \"nova-cell1-db-create-x6mtt\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.336061 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4035ab3-dd31-461d-a31c-d4c01cecd67e-operator-scripts\") pod 
\"nova-cell1-db-create-x6mtt\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: E0215 17:23:40.373809 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Feb 15 17:23:40 crc kubenswrapper[4585]: E0215 17:23:40.373965 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n565h5c4h585h546h89h59dhbchb7h566h57fh56dh64bhb5h5cfh5cdh4h6h67bhd8h65bh585h67fh56bh67bh66dh84h66fh564h7fh54h58h6cq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkglz,Rea
dOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(3682ca9c-f964-4e4b-ba4a-489a96ef3f65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.374276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gj7\" (UniqueName: \"kubernetes.io/projected/e4035ab3-dd31-461d-a31c-d4c01cecd67e-kube-api-access-v9gj7\") pod \"nova-cell1-db-create-x6mtt\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: E0215 17:23:40.400719 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="3682ca9c-f964-4e4b-ba4a-489a96ef3f65" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.421665 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-332c-account-create-update-pdlp9"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.424711 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.430533 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.444505 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f90b2d-7253-4232-b9bc-ab80a39d2a86-operator-scripts\") pod \"nova-api-854d-account-create-update-hkbvm\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.444763 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfhd7\" (UniqueName: \"kubernetes.io/projected/93f90b2d-7253-4232-b9bc-ab80a39d2a86-kube-api-access-nfhd7\") pod \"nova-api-854d-account-create-update-hkbvm\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.454870 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f90b2d-7253-4232-b9bc-ab80a39d2a86-operator-scripts\") pod \"nova-api-854d-account-create-update-hkbvm\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.457640 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.470540 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-332c-account-create-update-pdlp9"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.502492 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfhd7\" (UniqueName: \"kubernetes.io/projected/93f90b2d-7253-4232-b9bc-ab80a39d2a86-kube-api-access-nfhd7\") pod \"nova-api-854d-account-create-update-hkbvm\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.559735 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562720ec-ae91-4da4-874f-e61327e5b850-operator-scripts\") pod \"nova-cell0-332c-account-create-update-pdlp9\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.559894 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdbj8\" (UniqueName: \"kubernetes.io/projected/562720ec-ae91-4da4-874f-e61327e5b850-kube-api-access-gdbj8\") pod \"nova-cell0-332c-account-create-update-pdlp9\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.595561 4585 generic.go:334] "Generic (PLEG): container finished" podID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerID="26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c" exitCode=143 Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.596496 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"0b434dc6-96c7-4fc0-ba05-a37d48709a08","Type":"ContainerDied","Data":"26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c"} Feb 15 17:23:40 crc kubenswrapper[4585]: E0215 17:23:40.598867 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="3682ca9c-f964-4e4b-ba4a-489a96ef3f65" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.623178 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1712-account-create-update-m7x8g"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.624561 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.652874 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.661193 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdbj8\" (UniqueName: \"kubernetes.io/projected/562720ec-ae91-4da4-874f-e61327e5b850-kube-api-access-gdbj8\") pod \"nova-cell0-332c-account-create-update-pdlp9\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.661328 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562720ec-ae91-4da4-874f-e61327e5b850-operator-scripts\") pod \"nova-cell0-332c-account-create-update-pdlp9\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.662011 
4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562720ec-ae91-4da4-874f-e61327e5b850-operator-scripts\") pod \"nova-cell0-332c-account-create-update-pdlp9\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.682655 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1712-account-create-update-m7x8g"] Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.712848 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdbj8\" (UniqueName: \"kubernetes.io/projected/562720ec-ae91-4da4-874f-e61327e5b850-kube-api-access-gdbj8\") pod \"nova-cell0-332c-account-create-update-pdlp9\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.714513 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.753139 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.762425 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghtmz\" (UniqueName: \"kubernetes.io/projected/bd10677c-322d-4176-a8ac-85e603cd52c8-kube-api-access-ghtmz\") pod \"nova-cell1-1712-account-create-update-m7x8g\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.762578 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd10677c-322d-4176-a8ac-85e603cd52c8-operator-scripts\") pod \"nova-cell1-1712-account-create-update-m7x8g\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.865020 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghtmz\" (UniqueName: \"kubernetes.io/projected/bd10677c-322d-4176-a8ac-85e603cd52c8-kube-api-access-ghtmz\") pod \"nova-cell1-1712-account-create-update-m7x8g\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.865196 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd10677c-322d-4176-a8ac-85e603cd52c8-operator-scripts\") pod \"nova-cell1-1712-account-create-update-m7x8g\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.865876 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/bd10677c-322d-4176-a8ac-85e603cd52c8-operator-scripts\") pod \"nova-cell1-1712-account-create-update-m7x8g\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.897785 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghtmz\" (UniqueName: \"kubernetes.io/projected/bd10677c-322d-4176-a8ac-85e603cd52c8-kube-api-access-ghtmz\") pod \"nova-cell1-1712-account-create-update-m7x8g\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:40 crc kubenswrapper[4585]: I0215 17:23:40.962470 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.181666 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.282775 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-config-data\") pod \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.282860 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-combined-ca-bundle\") pod \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.282901 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-run-httpd\") pod \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.282927 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdjdv\" (UniqueName: \"kubernetes.io/projected/228e72e9-7294-4722-a38f-a0d8b3ae07bb-kube-api-access-mdjdv\") pod \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.283027 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-scripts\") pod \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.283050 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-log-httpd\") pod \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.283126 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-sg-core-conf-yaml\") pod \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\" (UID: \"228e72e9-7294-4722-a38f-a0d8b3ae07bb\") " Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.287655 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "228e72e9-7294-4722-a38f-a0d8b3ae07bb" (UID: "228e72e9-7294-4722-a38f-a0d8b3ae07bb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.291938 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "228e72e9-7294-4722-a38f-a0d8b3ae07bb" (UID: "228e72e9-7294-4722-a38f-a0d8b3ae07bb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.310058 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-scripts" (OuterVolumeSpecName: "scripts") pod "228e72e9-7294-4722-a38f-a0d8b3ae07bb" (UID: "228e72e9-7294-4722-a38f-a0d8b3ae07bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.331092 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228e72e9-7294-4722-a38f-a0d8b3ae07bb-kube-api-access-mdjdv" (OuterVolumeSpecName: "kube-api-access-mdjdv") pod "228e72e9-7294-4722-a38f-a0d8b3ae07bb" (UID: "228e72e9-7294-4722-a38f-a0d8b3ae07bb"). InnerVolumeSpecName "kube-api-access-mdjdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.352654 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dbmb7"] Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.381848 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "228e72e9-7294-4722-a38f-a0d8b3ae07bb" (UID: "228e72e9-7294-4722-a38f-a0d8b3ae07bb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.386576 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.386722 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.386793 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.386857 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/228e72e9-7294-4722-a38f-a0d8b3ae07bb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.386921 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdjdv\" (UniqueName: \"kubernetes.io/projected/228e72e9-7294-4722-a38f-a0d8b3ae07bb-kube-api-access-mdjdv\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.451316 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d9f8f5c8c-pfm4d"] Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.548708 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "228e72e9-7294-4722-a38f-a0d8b3ae07bb" (UID: "228e72e9-7294-4722-a38f-a0d8b3ae07bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.567765 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-config-data" (OuterVolumeSpecName: "config-data") pod "228e72e9-7294-4722-a38f-a0d8b3ae07bb" (UID: "228e72e9-7294-4722-a38f-a0d8b3ae07bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.611057 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.611080 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e72e9-7294-4722-a38f-a0d8b3ae07bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.659130 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"228e72e9-7294-4722-a38f-a0d8b3ae07bb","Type":"ContainerDied","Data":"53a8f15c47501b3e6e83006f90fc64fc3fde7c6b1253ddc99bbd655218d03271"} Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.659176 4585 scope.go:117] "RemoveContainer" containerID="236aaa2bad7127083505fcc58260b6c1613d9bc6975bc4ed64f8a7a3252af5aa" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.659280 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.671054 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" event={"ID":"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e","Type":"ContainerStarted","Data":"e00aeb40e10e63508e71de4ba87a976f4a7e9be92e42f21e618e33245c54949d"} Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.687273 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dbmb7" event={"ID":"b2ec1717-0d15-46bd-bfa9-e00997de9192","Type":"ContainerStarted","Data":"33e4ca6bad73a0ecca2e27f0783e8745fb27f8a9e44e5920f3a45b1957736e23"} Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.717103 4585 scope.go:117] "RemoveContainer" containerID="7738aae650a3573eea0e7313741be3f0168e818595b80dc9161a05a76e8dcc0e" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.754743 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-x6mtt"] Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.811430 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.837818 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.848588 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f6lj7"] Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.857684 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:41 crc kubenswrapper[4585]: E0215 17:23:41.858134 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="ceilometer-central-agent" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858147 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" 
containerName="ceilometer-central-agent" Feb 15 17:23:41 crc kubenswrapper[4585]: E0215 17:23:41.858163 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="sg-core" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858169 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="sg-core" Feb 15 17:23:41 crc kubenswrapper[4585]: E0215 17:23:41.858196 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="proxy-httpd" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858202 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="proxy-httpd" Feb 15 17:23:41 crc kubenswrapper[4585]: E0215 17:23:41.858226 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="ceilometer-notification-agent" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858232 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="ceilometer-notification-agent" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858430 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="ceilometer-central-agent" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858449 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="proxy-httpd" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858455 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="ceilometer-notification-agent" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.858466 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" containerName="sg-core" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.860527 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.864355 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.864952 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.888238 4585 scope.go:117] "RemoveContainer" containerID="93d552cd7db76d24983637cd4c14652529196123b7e1febb634f201dcb73da0b" Feb 15 17:23:41 crc kubenswrapper[4585]: I0215 17:23:41.894128 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.032640 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-config-data\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.032696 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.032735 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtd2k\" (UniqueName: \"kubernetes.io/projected/24b8f662-ff47-41d2-b351-94bde6050902-kube-api-access-dtd2k\") pod \"ceilometer-0\" (UID: 
\"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.032940 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-scripts\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.032989 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-run-httpd\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.033004 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-log-httpd\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.033027 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.095426 4585 scope.go:117] "RemoveContainer" containerID="c51b4188a7de9475092079c1adc41e734a477d2a9dbcdb76e5aae1c07408bc42" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.137408 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-log-httpd\") pod \"ceilometer-0\" (UID: 
\"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.138590 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-run-httpd\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.138658 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.150653 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-config-data\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.150748 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.150877 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtd2k\" (UniqueName: \"kubernetes.io/projected/24b8f662-ff47-41d2-b351-94bde6050902-kube-api-access-dtd2k\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.151028 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-scripts\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.151361 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-log-httpd\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.151776 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-run-httpd\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.161041 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-scripts\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.166769 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.167222 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.167880 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-config-data\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.180275 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtd2k\" (UniqueName: \"kubernetes.io/projected/24b8f662-ff47-41d2-b351-94bde6050902-kube-api-access-dtd2k\") pod \"ceilometer-0\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.231036 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.233179 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1712-account-create-update-m7x8g"] Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.274810 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-332c-account-create-update-pdlp9"] Feb 15 17:23:42 crc kubenswrapper[4585]: W0215 17:23:42.302687 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod562720ec_ae91_4da4_874f_e61327e5b850.slice/crio-21b1da20c7df9442d01ad3485cb9cc0aac8ac80f729d2ff7e307918dee0a7c6c WatchSource:0}: Error finding container 21b1da20c7df9442d01ad3485cb9cc0aac8ac80f729d2ff7e307918dee0a7c6c: Status 404 returned error can't find the container with id 21b1da20c7df9442d01ad3485cb9cc0aac8ac80f729d2ff7e307918dee0a7c6c Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.305793 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-854d-account-create-update-hkbvm"] Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.608135 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.613175 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerName="glance-httpd" containerID="cri-o://5b47d60a7a8ba02e8fc7332ebafe7d90aeefdc80c41921e827c3931aa8648964" gracePeriod=30 Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.608585 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerName="glance-log" containerID="cri-o://04b9ac1a42f254f32273d0d1e7d8129d13abdc39ccffb7beeee7e40cc40d5351" gracePeriod=30 Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.704264 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.716939 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" event={"ID":"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e","Type":"ContainerStarted","Data":"ed171e2a7c8d0f6bf76df05de2f478f9dcbe6e6036e0dcbea251897afca17c12"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.716977 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" event={"ID":"827a8b91-c0e1-4ba9-a90a-e0767e9fb71e","Type":"ContainerStarted","Data":"7dc7d57c226758bb572e80647b7aa4c80b3a55229f75733e81f38f1b258f5322"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.717723 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.717758 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.738386 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-db-create-f6lj7" event={"ID":"38468fa5-3373-42c6-88a2-3b405081fd2f","Type":"ContainerStarted","Data":"325474520dc7f2932b9ea23f1ea0024b99632916658f1aeabb4af578c10e03b8"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.747273 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" podStartSLOduration=11.747253745 podStartE2EDuration="11.747253745s" podCreationTimestamp="2026-02-15 17:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:42.737810518 +0000 UTC m=+1078.681218650" watchObservedRunningTime="2026-02-15 17:23:42.747253745 +0000 UTC m=+1078.690661877" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.761461 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-854d-account-create-update-hkbvm" event={"ID":"93f90b2d-7253-4232-b9bc-ab80a39d2a86","Type":"ContainerStarted","Data":"52ab93f7321f033699af15ea3354f8dd596ec1e3e48ac6672ecfa74fac5b092a"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.775891 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x6mtt" event={"ID":"e4035ab3-dd31-461d-a31c-d4c01cecd67e","Type":"ContainerStarted","Data":"537b10c731ef19edbe2e91529e3e45a4ec77fde4ffafe71c3d85fa967eb1a286"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.795488 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" event={"ID":"562720ec-ae91-4da4-874f-e61327e5b850","Type":"ContainerStarted","Data":"21b1da20c7df9442d01ad3485cb9cc0aac8ac80f729d2ff7e307918dee0a7c6c"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.796770 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" 
event={"ID":"bd10677c-322d-4176-a8ac-85e603cd52c8","Type":"ContainerStarted","Data":"92861d3946688b7880c22b7934f66dc9664cff0acc79c1498a37268a8e5b28f5"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.797965 4585 generic.go:334] "Generic (PLEG): container finished" podID="b2ec1717-0d15-46bd-bfa9-e00997de9192" containerID="6baf9086dabfeb2c227dda15190eb426fb749627268ece14ee5bcfb2c590e49d" exitCode=0 Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.798009 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dbmb7" event={"ID":"b2ec1717-0d15-46bd-bfa9-e00997de9192","Type":"ContainerDied","Data":"6baf9086dabfeb2c227dda15190eb426fb749627268ece14ee5bcfb2c590e49d"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.825540 4585 generic.go:334] "Generic (PLEG): container finished" podID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerID="b4152a59efd5841b1bfca434f1be40874851466c91d0256932fbdb62beef2fa1" exitCode=0 Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.825582 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-657d9d46dd-264lh" event={"ID":"83f72fb7-0fae-45bd-894b-0b8235e489eb","Type":"ContainerDied","Data":"b4152a59efd5841b1bfca434f1be40874851466c91d0256932fbdb62beef2fa1"} Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.826672 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.871134 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228e72e9-7294-4722-a38f-a0d8b3ae07bb" path="/var/lib/kubelet/pods/228e72e9-7294-4722-a38f-a0d8b3ae07bb/volumes" Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.998661 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9zmw\" (UniqueName: \"kubernetes.io/projected/83f72fb7-0fae-45bd-894b-0b8235e489eb-kube-api-access-w9zmw\") pod \"83f72fb7-0fae-45bd-894b-0b8235e489eb\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.998725 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-config\") pod \"83f72fb7-0fae-45bd-894b-0b8235e489eb\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.998876 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-combined-ca-bundle\") pod \"83f72fb7-0fae-45bd-894b-0b8235e489eb\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.998967 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-httpd-config\") pod \"83f72fb7-0fae-45bd-894b-0b8235e489eb\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " Feb 15 17:23:42 crc kubenswrapper[4585]: I0215 17:23:42.999008 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-ovndb-tls-certs\") pod 
\"83f72fb7-0fae-45bd-894b-0b8235e489eb\" (UID: \"83f72fb7-0fae-45bd-894b-0b8235e489eb\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.028423 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.148392 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "83f72fb7-0fae-45bd-894b-0b8235e489eb" (UID: "83f72fb7-0fae-45bd-894b-0b8235e489eb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.160047 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f72fb7-0fae-45bd-894b-0b8235e489eb-kube-api-access-w9zmw" (OuterVolumeSpecName: "kube-api-access-w9zmw") pod "83f72fb7-0fae-45bd-894b-0b8235e489eb" (UID: "83f72fb7-0fae-45bd-894b-0b8235e489eb"). InnerVolumeSpecName "kube-api-access-w9zmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.207826 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9zmw\" (UniqueName: \"kubernetes.io/projected/83f72fb7-0fae-45bd-894b-0b8235e489eb-kube-api-access-w9zmw\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.213325 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.309740 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83f72fb7-0fae-45bd-894b-0b8235e489eb" (UID: "83f72fb7-0fae-45bd-894b-0b8235e489eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.312919 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-config" (OuterVolumeSpecName: "config") pod "83f72fb7-0fae-45bd-894b-0b8235e489eb" (UID: "83f72fb7-0fae-45bd-894b-0b8235e489eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.319870 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.319897 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.436816 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "83f72fb7-0fae-45bd-894b-0b8235e489eb" (UID: "83f72fb7-0fae-45bd-894b-0b8235e489eb"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.530934 4585 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83f72fb7-0fae-45bd-894b-0b8235e489eb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.710539 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.840390 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" event={"ID":"bd10677c-322d-4176-a8ac-85e603cd52c8","Type":"ContainerStarted","Data":"78202bf3429568a1bb1342e11f91a0d256b1b67754b3fd19846c01647926b594"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.842386 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerStarted","Data":"a3aabb2d2b55b5f121501192bd8bcf6b007bc3480ab5e5f3df2013ea02c2b498"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847319 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847383 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-httpd-run\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847473 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-config-data\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847523 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-scripts\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: 
\"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847561 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-logs\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847580 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnz6d\" (UniqueName: \"kubernetes.io/projected/0b434dc6-96c7-4fc0-ba05-a37d48709a08-kube-api-access-xnz6d\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847621 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-public-tls-certs\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847699 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-combined-ca-bundle\") pod \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\" (UID: \"0b434dc6-96c7-4fc0-ba05-a37d48709a08\") " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.847795 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.848159 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.853534 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-logs" (OuterVolumeSpecName: "logs") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.856842 4585 generic.go:334] "Generic (PLEG): container finished" podID="38468fa5-3373-42c6-88a2-3b405081fd2f" containerID="4d7b1f6153119c17f82d34badc87046a80b47b8aba51ce492d52ac17fd4e77a0" exitCode=0 Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.857005 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f6lj7" event={"ID":"38468fa5-3373-42c6-88a2-3b405081fd2f","Type":"ContainerDied","Data":"4d7b1f6153119c17f82d34badc87046a80b47b8aba51ce492d52ac17fd4e77a0"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.862482 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" podStartSLOduration=3.862469638 podStartE2EDuration="3.862469638s" podCreationTimestamp="2026-02-15 17:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:43.857014099 +0000 UTC m=+1079.800422231" watchObservedRunningTime="2026-02-15 17:23:43.862469638 +0000 UTC m=+1079.805877770" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.863949 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-657d9d46dd-264lh" event={"ID":"83f72fb7-0fae-45bd-894b-0b8235e489eb","Type":"ContainerDied","Data":"d7b9e68df219dc66bd23c119a1e4a50731140a4b69becb60eb08176e300f5b0f"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.863993 4585 scope.go:117] "RemoveContainer" containerID="ac218d79b8c9ccf64945c41c3f474ce122891000dc4674623580d8148f91dba3" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.864087 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-657d9d46dd-264lh" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.866237 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-scripts" (OuterVolumeSpecName: "scripts") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.876288 4585 generic.go:334] "Generic (PLEG): container finished" podID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerID="b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6" exitCode=0 Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.876346 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b434dc6-96c7-4fc0-ba05-a37d48709a08","Type":"ContainerDied","Data":"b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.876369 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0b434dc6-96c7-4fc0-ba05-a37d48709a08","Type":"ContainerDied","Data":"a59ac35c6f75e94cc8c874ae2642e1c3a764c0e1e7c7a6f433f75d2e827f3c9f"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.876418 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.878414 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b434dc6-96c7-4fc0-ba05-a37d48709a08-kube-api-access-xnz6d" (OuterVolumeSpecName: "kube-api-access-xnz6d") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "kube-api-access-xnz6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.886350 4585 generic.go:334] "Generic (PLEG): container finished" podID="e4035ab3-dd31-461d-a31c-d4c01cecd67e" containerID="6e8b18e9e15054974ea5fddce62a4420ba7867ba4d3ad643f19a3d1044507198" exitCode=0 Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.886404 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x6mtt" event={"ID":"e4035ab3-dd31-461d-a31c-d4c01cecd67e","Type":"ContainerDied","Data":"6e8b18e9e15054974ea5fddce62a4420ba7867ba4d3ad643f19a3d1044507198"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.888925 4585 generic.go:334] "Generic (PLEG): container finished" podID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerID="04b9ac1a42f254f32273d0d1e7d8129d13abdc39ccffb7beeee7e40cc40d5351" exitCode=143 Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.888983 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74b1e0f-71b3-4fe0-9153-4220719171aa","Type":"ContainerDied","Data":"04b9ac1a42f254f32273d0d1e7d8129d13abdc39ccffb7beeee7e40cc40d5351"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.890620 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" 
event={"ID":"562720ec-ae91-4da4-874f-e61327e5b850","Type":"ContainerStarted","Data":"5e0c0a675f820feae9c964d4feb9b26af060c2f5499275dbd3c24a8680b23a24"} Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.891769 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.915296 4585 scope.go:117] "RemoveContainer" containerID="b4152a59efd5841b1bfca434f1be40874851466c91d0256932fbdb62beef2fa1" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.940015 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" podStartSLOduration=3.939992651 podStartE2EDuration="3.939992651s" podCreationTimestamp="2026-02-15 17:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:43.922958287 +0000 UTC m=+1079.866366419" watchObservedRunningTime="2026-02-15 17:23:43.939992651 +0000 UTC m=+1079.883400783" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.950311 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.950336 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.950345 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0b434dc6-96c7-4fc0-ba05-a37d48709a08-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.950353 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnz6d\" (UniqueName: \"kubernetes.io/projected/0b434dc6-96c7-4fc0-ba05-a37d48709a08-kube-api-access-xnz6d\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.955975 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-657d9d46dd-264lh"] Feb 15 17:23:43 crc kubenswrapper[4585]: I0215 17:23:43.964627 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-657d9d46dd-264lh"] Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.045688 4585 scope.go:117] "RemoveContainer" containerID="b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.056889 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.066286 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.069739 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.155318 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.155342 4585 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.155352 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.298047 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-config-data" (OuterVolumeSpecName: "config-data") pod "0b434dc6-96c7-4fc0-ba05-a37d48709a08" (UID: "0b434dc6-96c7-4fc0-ba05-a37d48709a08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.360575 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b434dc6-96c7-4fc0-ba05-a37d48709a08-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.395741 4585 scope.go:117] "RemoveContainer" containerID="26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.397637 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.414671 4585 scope.go:117] "RemoveContainer" containerID="b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6" Feb 15 17:23:44 crc kubenswrapper[4585]: E0215 17:23:44.416711 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6\": container with ID starting with b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6 not found: ID does not exist" containerID="b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.416783 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6"} err="failed to get container status \"b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6\": rpc error: code = NotFound desc = could not find container \"b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6\": container with ID starting with b0ee99814b080a1a5c2d5b0b2185122e8e77ae418de0af710fcefb828f9958a6 not found: ID does not exist" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.416808 4585 scope.go:117] "RemoveContainer" containerID="26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c" Feb 15 17:23:44 crc kubenswrapper[4585]: E0215 17:23:44.417543 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c\": container with ID starting with 26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c not found: ID does not exist" containerID="26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 
17:23:44.417799 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c"} err="failed to get container status \"26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c\": rpc error: code = NotFound desc = could not find container \"26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c\": container with ID starting with 26548225480c943d183f14e4b2c560bce5bd9a01a696949200f2d55be847c43c not found: ID does not exist" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.543695 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.571205 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ec1717-0d15-46bd-bfa9-e00997de9192-operator-scripts\") pod \"b2ec1717-0d15-46bd-bfa9-e00997de9192\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.571369 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbsfd\" (UniqueName: \"kubernetes.io/projected/b2ec1717-0d15-46bd-bfa9-e00997de9192-kube-api-access-kbsfd\") pod \"b2ec1717-0d15-46bd-bfa9-e00997de9192\" (UID: \"b2ec1717-0d15-46bd-bfa9-e00997de9192\") " Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.573245 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.573702 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ec1717-0d15-46bd-bfa9-e00997de9192-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2ec1717-0d15-46bd-bfa9-e00997de9192" (UID: "b2ec1717-0d15-46bd-bfa9-e00997de9192"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.623165 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ec1717-0d15-46bd-bfa9-e00997de9192-kube-api-access-kbsfd" (OuterVolumeSpecName: "kube-api-access-kbsfd") pod "b2ec1717-0d15-46bd-bfa9-e00997de9192" (UID: "b2ec1717-0d15-46bd-bfa9-e00997de9192"). InnerVolumeSpecName "kube-api-access-kbsfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.662337 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:23:44 crc kubenswrapper[4585]: E0215 17:23:44.670588 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-api" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.670825 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-api" Feb 15 17:23:44 crc kubenswrapper[4585]: E0215 17:23:44.670854 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-httpd" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.670863 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-httpd" Feb 15 17:23:44 crc kubenswrapper[4585]: E0215 17:23:44.673842 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-log" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.673864 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-log" Feb 15 17:23:44 crc kubenswrapper[4585]: E0215 17:23:44.673902 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-httpd" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.673913 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-httpd" Feb 15 17:23:44 crc kubenswrapper[4585]: E0215 17:23:44.673933 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ec1717-0d15-46bd-bfa9-e00997de9192" containerName="mariadb-database-create" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.673940 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ec1717-0d15-46bd-bfa9-e00997de9192" containerName="mariadb-database-create" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.675024 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ec1717-0d15-46bd-bfa9-e00997de9192" containerName="mariadb-database-create" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.675058 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-httpd" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.675082 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-log" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.675116 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" containerName="neutron-api" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.675130 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" containerName="glance-httpd" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.679851 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ec1717-0d15-46bd-bfa9-e00997de9192-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:44 crc 
kubenswrapper[4585]: I0215 17:23:44.679886 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbsfd\" (UniqueName: \"kubernetes.io/projected/b2ec1717-0d15-46bd-bfa9-e00997de9192-kube-api-access-kbsfd\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.680785 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.689152 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.689323 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.689553 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781691 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781744 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781776 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781815 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781833 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09414c20-b9cb-44ce-a829-112cc2f307d1-logs\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781897 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09414c20-b9cb-44ce-a829-112cc2f307d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781950 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pwr\" (UniqueName: \"kubernetes.io/projected/09414c20-b9cb-44ce-a829-112cc2f307d1-kube-api-access-g6pwr\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.781975 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.858142 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b434dc6-96c7-4fc0-ba05-a37d48709a08" path="/var/lib/kubelet/pods/0b434dc6-96c7-4fc0-ba05-a37d48709a08/volumes" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.858930 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f72fb7-0fae-45bd-894b-0b8235e489eb" path="/var/lib/kubelet/pods/83f72fb7-0fae-45bd-894b-0b8235e489eb/volumes" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884670 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pwr\" (UniqueName: \"kubernetes.io/projected/09414c20-b9cb-44ce-a829-112cc2f307d1-kube-api-access-g6pwr\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884725 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884771 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884805 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884835 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884882 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884902 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09414c20-b9cb-44ce-a829-112cc2f307d1-logs\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.884994 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09414c20-b9cb-44ce-a829-112cc2f307d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.885452 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/09414c20-b9cb-44ce-a829-112cc2f307d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.887204 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.887844 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09414c20-b9cb-44ce-a829-112cc2f307d1-logs\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.891746 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.892236 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.898429 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.908072 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.918527 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.920227 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09414c20-b9cb-44ce-a829-112cc2f307d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.931119 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pwr\" (UniqueName: \"kubernetes.io/projected/09414c20-b9cb-44ce-a829-112cc2f307d1-kube-api-access-g6pwr\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.931972 4585 generic.go:334] "Generic (PLEG): container finished" podID="bd10677c-322d-4176-a8ac-85e603cd52c8" containerID="78202bf3429568a1bb1342e11f91a0d256b1b67754b3fd19846c01647926b594" exitCode=0 Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.932036 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" event={"ID":"bd10677c-322d-4176-a8ac-85e603cd52c8","Type":"ContainerDied","Data":"78202bf3429568a1bb1342e11f91a0d256b1b67754b3fd19846c01647926b594"} Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.962731 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerStarted","Data":"901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169"} Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.964882 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dbmb7" event={"ID":"b2ec1717-0d15-46bd-bfa9-e00997de9192","Type":"ContainerDied","Data":"33e4ca6bad73a0ecca2e27f0783e8745fb27f8a9e44e5920f3a45b1957736e23"} Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.964905 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33e4ca6bad73a0ecca2e27f0783e8745fb27f8a9e44e5920f3a45b1957736e23" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.964947 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dbmb7" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.971452 4585 generic.go:334] "Generic (PLEG): container finished" podID="93f90b2d-7253-4232-b9bc-ab80a39d2a86" containerID="a7ccd1dd192c7184c48b840e9aada9850e6ee7d83d8d08c4edf6f0bd519c579f" exitCode=0 Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.971503 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-854d-account-create-update-hkbvm" event={"ID":"93f90b2d-7253-4232-b9bc-ab80a39d2a86","Type":"ContainerDied","Data":"a7ccd1dd192c7184c48b840e9aada9850e6ee7d83d8d08c4edf6f0bd519c579f"} Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.991004 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"09414c20-b9cb-44ce-a829-112cc2f307d1\") " pod="openstack/glance-default-external-api-0" Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.996092 4585 generic.go:334] "Generic (PLEG): container finished" podID="562720ec-ae91-4da4-874f-e61327e5b850" 
containerID="5e0c0a675f820feae9c964d4feb9b26af060c2f5499275dbd3c24a8680b23a24" exitCode=0 Feb 15 17:23:44 crc kubenswrapper[4585]: I0215 17:23:44.996718 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" event={"ID":"562720ec-ae91-4da4-874f-e61327e5b850","Type":"ContainerDied","Data":"5e0c0a675f820feae9c964d4feb9b26af060c2f5499275dbd3c24a8680b23a24"} Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.027904 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.531558 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.620454 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4035ab3-dd31-461d-a31c-d4c01cecd67e-operator-scripts\") pod \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.620495 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gj7\" (UniqueName: \"kubernetes.io/projected/e4035ab3-dd31-461d-a31c-d4c01cecd67e-kube-api-access-v9gj7\") pod \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\" (UID: \"e4035ab3-dd31-461d-a31c-d4c01cecd67e\") " Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.622278 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4035ab3-dd31-461d-a31c-d4c01cecd67e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4035ab3-dd31-461d-a31c-d4c01cecd67e" (UID: "e4035ab3-dd31-461d-a31c-d4c01cecd67e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.630794 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4035ab3-dd31-461d-a31c-d4c01cecd67e-kube-api-access-v9gj7" (OuterVolumeSpecName: "kube-api-access-v9gj7") pod "e4035ab3-dd31-461d-a31c-d4c01cecd67e" (UID: "e4035ab3-dd31-461d-a31c-d4c01cecd67e"). InnerVolumeSpecName "kube-api-access-v9gj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.664726 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.722881 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4035ab3-dd31-461d-a31c-d4c01cecd67e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.722914 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gj7\" (UniqueName: \"kubernetes.io/projected/e4035ab3-dd31-461d-a31c-d4c01cecd67e-kube-api-access-v9gj7\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.824477 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38468fa5-3373-42c6-88a2-3b405081fd2f-operator-scripts\") pod \"38468fa5-3373-42c6-88a2-3b405081fd2f\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.824646 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dgxc\" (UniqueName: \"kubernetes.io/projected/38468fa5-3373-42c6-88a2-3b405081fd2f-kube-api-access-2dgxc\") pod \"38468fa5-3373-42c6-88a2-3b405081fd2f\" (UID: \"38468fa5-3373-42c6-88a2-3b405081fd2f\") " Feb 15 
17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.824964 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38468fa5-3373-42c6-88a2-3b405081fd2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38468fa5-3373-42c6-88a2-3b405081fd2f" (UID: "38468fa5-3373-42c6-88a2-3b405081fd2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.826015 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38468fa5-3373-42c6-88a2-3b405081fd2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.829990 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38468fa5-3373-42c6-88a2-3b405081fd2f-kube-api-access-2dgxc" (OuterVolumeSpecName: "kube-api-access-2dgxc") pod "38468fa5-3373-42c6-88a2-3b405081fd2f" (UID: "38468fa5-3373-42c6-88a2-3b405081fd2f"). InnerVolumeSpecName "kube-api-access-2dgxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.930081 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dgxc\" (UniqueName: \"kubernetes.io/projected/38468fa5-3373-42c6-88a2-3b405081fd2f-kube-api-access-2dgxc\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:45 crc kubenswrapper[4585]: I0215 17:23:45.985321 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.018938 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-x6mtt" event={"ID":"e4035ab3-dd31-461d-a31c-d4c01cecd67e","Type":"ContainerDied","Data":"537b10c731ef19edbe2e91529e3e45a4ec77fde4ffafe71c3d85fa967eb1a286"} Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.019082 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="537b10c731ef19edbe2e91529e3e45a4ec77fde4ffafe71c3d85fa967eb1a286" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.019148 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-x6mtt" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.035081 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerStarted","Data":"7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978"} Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.046862 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f6lj7" event={"ID":"38468fa5-3373-42c6-88a2-3b405081fd2f","Type":"ContainerDied","Data":"325474520dc7f2932b9ea23f1ea0024b99632916658f1aeabb4af578c10e03b8"} Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.046903 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="325474520dc7f2932b9ea23f1ea0024b99632916658f1aeabb4af578c10e03b8" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.046971 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f6lj7" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.049524 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09414c20-b9cb-44ce-a829-112cc2f307d1","Type":"ContainerStarted","Data":"713b2d46c71930898d9e44316185c13f4b39a011ef776ccd5ba3ddd2da203a56"} Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.629579 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.782352 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdbj8\" (UniqueName: \"kubernetes.io/projected/562720ec-ae91-4da4-874f-e61327e5b850-kube-api-access-gdbj8\") pod \"562720ec-ae91-4da4-874f-e61327e5b850\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.782523 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562720ec-ae91-4da4-874f-e61327e5b850-operator-scripts\") pod \"562720ec-ae91-4da4-874f-e61327e5b850\" (UID: \"562720ec-ae91-4da4-874f-e61327e5b850\") " Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.797299 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/562720ec-ae91-4da4-874f-e61327e5b850-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "562720ec-ae91-4da4-874f-e61327e5b850" (UID: "562720ec-ae91-4da4-874f-e61327e5b850"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.802626 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/562720ec-ae91-4da4-874f-e61327e5b850-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.819541 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562720ec-ae91-4da4-874f-e61327e5b850-kube-api-access-gdbj8" (OuterVolumeSpecName: "kube-api-access-gdbj8") pod "562720ec-ae91-4da4-874f-e61327e5b850" (UID: "562720ec-ae91-4da4-874f-e61327e5b850"). InnerVolumeSpecName "kube-api-access-gdbj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.877726 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.904463 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdbj8\" (UniqueName: \"kubernetes.io/projected/562720ec-ae91-4da4-874f-e61327e5b850-kube-api-access-gdbj8\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:46 crc kubenswrapper[4585]: I0215 17:23:46.992272 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.006042 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfhd7\" (UniqueName: \"kubernetes.io/projected/93f90b2d-7253-4232-b9bc-ab80a39d2a86-kube-api-access-nfhd7\") pod \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.006129 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f90b2d-7253-4232-b9bc-ab80a39d2a86-operator-scripts\") pod \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\" (UID: \"93f90b2d-7253-4232-b9bc-ab80a39d2a86\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.010754 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f90b2d-7253-4232-b9bc-ab80a39d2a86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93f90b2d-7253-4232-b9bc-ab80a39d2a86" (UID: "93f90b2d-7253-4232-b9bc-ab80a39d2a86"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.021727 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.021790 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.021804 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f90b2d-7253-4232-b9bc-ab80a39d2a86-kube-api-access-nfhd7" (OuterVolumeSpecName: "kube-api-access-nfhd7") pod "93f90b2d-7253-4232-b9bc-ab80a39d2a86" (UID: "93f90b2d-7253-4232-b9bc-ab80a39d2a86"). InnerVolumeSpecName "kube-api-access-nfhd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.113877 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghtmz\" (UniqueName: \"kubernetes.io/projected/bd10677c-322d-4176-a8ac-85e603cd52c8-kube-api-access-ghtmz\") pod \"bd10677c-322d-4176-a8ac-85e603cd52c8\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.114028 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd10677c-322d-4176-a8ac-85e603cd52c8-operator-scripts\") pod \"bd10677c-322d-4176-a8ac-85e603cd52c8\" (UID: \"bd10677c-322d-4176-a8ac-85e603cd52c8\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.114432 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfhd7\" (UniqueName: \"kubernetes.io/projected/93f90b2d-7253-4232-b9bc-ab80a39d2a86-kube-api-access-nfhd7\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.114449 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f90b2d-7253-4232-b9bc-ab80a39d2a86-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.122694 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd10677c-322d-4176-a8ac-85e603cd52c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd10677c-322d-4176-a8ac-85e603cd52c8" (UID: "bd10677c-322d-4176-a8ac-85e603cd52c8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.172752 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd10677c-322d-4176-a8ac-85e603cd52c8-kube-api-access-ghtmz" (OuterVolumeSpecName: "kube-api-access-ghtmz") pod "bd10677c-322d-4176-a8ac-85e603cd52c8" (UID: "bd10677c-322d-4176-a8ac-85e603cd52c8"). InnerVolumeSpecName "kube-api-access-ghtmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.176041 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" event={"ID":"bd10677c-322d-4176-a8ac-85e603cd52c8","Type":"ContainerDied","Data":"92861d3946688b7880c22b7934f66dc9664cff0acc79c1498a37268a8e5b28f5"} Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.176073 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92861d3946688b7880c22b7934f66dc9664cff0acc79c1498a37268a8e5b28f5" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.176142 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1712-account-create-update-m7x8g" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.195164 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerStarted","Data":"1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a"} Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.203444 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-854d-account-create-update-hkbvm" event={"ID":"93f90b2d-7253-4232-b9bc-ab80a39d2a86","Type":"ContainerDied","Data":"52ab93f7321f033699af15ea3354f8dd596ec1e3e48ac6672ecfa74fac5b092a"} Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.203483 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ab93f7321f033699af15ea3354f8dd596ec1e3e48ac6672ecfa74fac5b092a" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.203537 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-854d-account-create-update-hkbvm" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.230923 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd10677c-322d-4176-a8ac-85e603cd52c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.230971 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghtmz\" (UniqueName: \"kubernetes.io/projected/bd10677c-322d-4176-a8ac-85e603cd52c8-kube-api-access-ghtmz\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.264204 4585 generic.go:334] "Generic (PLEG): container finished" podID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerID="5b47d60a7a8ba02e8fc7332ebafe7d90aeefdc80c41921e827c3931aa8648964" exitCode=0 Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.264277 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74b1e0f-71b3-4fe0-9153-4220719171aa","Type":"ContainerDied","Data":"5b47d60a7a8ba02e8fc7332ebafe7d90aeefdc80c41921e827c3931aa8648964"} Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.307908 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" event={"ID":"562720ec-ae91-4da4-874f-e61327e5b850","Type":"ContainerDied","Data":"21b1da20c7df9442d01ad3485cb9cc0aac8ac80f729d2ff7e307918dee0a7c6c"} Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.307946 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21b1da20c7df9442d01ad3485cb9cc0aac8ac80f729d2ff7e307918dee0a7c6c" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.307985 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-332c-account-create-update-pdlp9" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.316456 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.321901 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d9f8f5c8c-pfm4d" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.412837 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.558766 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-httpd-run\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.559009 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-combined-ca-bundle\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.559054 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-scripts\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.559077 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-internal-tls-certs\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: 
\"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.559124 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.559143 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-logs\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.559167 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqj72\" (UniqueName: \"kubernetes.io/projected/c74b1e0f-71b3-4fe0-9153-4220719171aa-kube-api-access-fqj72\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.559211 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-config-data\") pod \"c74b1e0f-71b3-4fe0-9153-4220719171aa\" (UID: \"c74b1e0f-71b3-4fe0-9153-4220719171aa\") " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.565648 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-logs" (OuterVolumeSpecName: "logs") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.566887 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.615745 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74b1e0f-71b3-4fe0-9153-4220719171aa-kube-api-access-fqj72" (OuterVolumeSpecName: "kube-api-access-fqj72") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "kube-api-access-fqj72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.615812 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.643774 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-scripts" (OuterVolumeSpecName: "scripts") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.661900 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.661947 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.661959 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.661969 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqj72\" (UniqueName: \"kubernetes.io/projected/c74b1e0f-71b3-4fe0-9153-4220719171aa-kube-api-access-fqj72\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.661982 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c74b1e0f-71b3-4fe0-9153-4220719171aa-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.698030 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.712926 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-config-data" (OuterVolumeSpecName: "config-data") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.716756 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.763403 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.763689 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.763789 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.775956 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c74b1e0f-71b3-4fe0-9153-4220719171aa" (UID: "c74b1e0f-71b3-4fe0-9153-4220719171aa"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:47 crc kubenswrapper[4585]: I0215 17:23:47.865884 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c74b1e0f-71b3-4fe0-9153-4220719171aa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.337197 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c74b1e0f-71b3-4fe0-9153-4220719171aa","Type":"ContainerDied","Data":"337a8c43c4571aa5e0a1f368dc7c7b23776e8f6d81c46bc268d7ee647bd021bb"} Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.337467 4585 scope.go:117] "RemoveContainer" containerID="5b47d60a7a8ba02e8fc7332ebafe7d90aeefdc80c41921e827c3931aa8648964" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.337618 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.353126 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09414c20-b9cb-44ce-a829-112cc2f307d1","Type":"ContainerStarted","Data":"c62572a67051282f235832a6d801f5a49cfa100d996f37c473394632233a94f6"} Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.468166 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.468378 4585 scope.go:117] "RemoveContainer" containerID="04b9ac1a42f254f32273d0d1e7d8129d13abdc39ccffb7beeee7e40cc40d5351" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.561477 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.585565 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 
15 17:23:48 crc kubenswrapper[4585]: E0215 17:23:48.586790 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd10677c-322d-4176-a8ac-85e603cd52c8" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.586807 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd10677c-322d-4176-a8ac-85e603cd52c8" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: E0215 17:23:48.586834 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38468fa5-3373-42c6-88a2-3b405081fd2f" containerName="mariadb-database-create" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.586841 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="38468fa5-3373-42c6-88a2-3b405081fd2f" containerName="mariadb-database-create" Feb 15 17:23:48 crc kubenswrapper[4585]: E0215 17:23:48.586864 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerName="glance-httpd" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.586870 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerName="glance-httpd" Feb 15 17:23:48 crc kubenswrapper[4585]: E0215 17:23:48.586879 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4035ab3-dd31-461d-a31c-d4c01cecd67e" containerName="mariadb-database-create" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.586885 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4035ab3-dd31-461d-a31c-d4c01cecd67e" containerName="mariadb-database-create" Feb 15 17:23:48 crc kubenswrapper[4585]: E0215 17:23:48.586907 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerName="glance-log" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.586913 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" 
containerName="glance-log" Feb 15 17:23:48 crc kubenswrapper[4585]: E0215 17:23:48.586919 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562720ec-ae91-4da4-874f-e61327e5b850" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.586924 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="562720ec-ae91-4da4-874f-e61327e5b850" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: E0215 17:23:48.586936 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f90b2d-7253-4232-b9bc-ab80a39d2a86" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.586943 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f90b2d-7253-4232-b9bc-ab80a39d2a86" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.587141 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerName="glance-httpd" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.587153 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4035ab3-dd31-461d-a31c-d4c01cecd67e" containerName="mariadb-database-create" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.587161 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd10677c-322d-4176-a8ac-85e603cd52c8" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.587176 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="38468fa5-3373-42c6-88a2-3b405081fd2f" containerName="mariadb-database-create" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.587191 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f90b2d-7253-4232-b9bc-ab80a39d2a86" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.587218 
4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="562720ec-ae91-4da4-874f-e61327e5b850" containerName="mariadb-account-create-update" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.587230 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" containerName="glance-log" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.592317 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.594651 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.598284 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607134 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c0fc65d-8f21-4e29-819a-06cb632e02cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607182 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607244 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0fc65d-8f21-4e29-819a-06cb632e02cf-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607262 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607290 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607320 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607352 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.607385 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6m5l\" (UniqueName: 
\"kubernetes.io/projected/3c0fc65d-8f21-4e29-819a-06cb632e02cf-kube-api-access-p6m5l\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.625656 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708518 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708562 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6m5l\" (UniqueName: \"kubernetes.io/projected/3c0fc65d-8f21-4e29-819a-06cb632e02cf-kube-api-access-p6m5l\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708643 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c0fc65d-8f21-4e29-819a-06cb632e02cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708673 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708735 
4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0fc65d-8f21-4e29-819a-06cb632e02cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708751 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708785 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708816 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.708936 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.709274 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c0fc65d-8f21-4e29-819a-06cb632e02cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.709695 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0fc65d-8f21-4e29-819a-06cb632e02cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.725404 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.725828 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.729306 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.735527 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0fc65d-8f21-4e29-819a-06cb632e02cf-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.751235 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6m5l\" (UniqueName: \"kubernetes.io/projected/3c0fc65d-8f21-4e29-819a-06cb632e02cf-kube-api-access-p6m5l\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.845389 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"3c0fc65d-8f21-4e29-819a-06cb632e02cf\") " pod="openstack/glance-default-internal-api-0" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.865574 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74b1e0f-71b3-4fe0-9153-4220719171aa" path="/var/lib/kubelet/pods/c74b1e0f-71b3-4fe0-9153-4220719171aa/volumes" Feb 15 17:23:48 crc kubenswrapper[4585]: I0215 17:23:48.913241 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.392223 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"09414c20-b9cb-44ce-a829-112cc2f307d1","Type":"ContainerStarted","Data":"3e5b0b665a4d7a08a6bd310c9211e54283cd21627e39a77d1d1af90efc5358c2"} Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.411489 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerStarted","Data":"64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe"} Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.411651 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-central-agent" containerID="cri-o://901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169" gracePeriod=30 Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.411728 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.411761 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="proxy-httpd" containerID="cri-o://64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe" gracePeriod=30 Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.411845 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-notification-agent" containerID="cri-o://7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978" gracePeriod=30 Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.411969 4585 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="sg-core" containerID="cri-o://1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a" gracePeriod=30 Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.426903 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.426883688 podStartE2EDuration="5.426883688s" podCreationTimestamp="2026-02-15 17:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:49.420136673 +0000 UTC m=+1085.363544805" watchObservedRunningTime="2026-02-15 17:23:49.426883688 +0000 UTC m=+1085.370291820" Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.448721 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.647937094 podStartE2EDuration="8.448705712s" podCreationTimestamp="2026-02-15 17:23:41 +0000 UTC" firstStartedPulling="2026-02-15 17:23:43.22592239 +0000 UTC m=+1079.169330522" lastFinishedPulling="2026-02-15 17:23:48.026691008 +0000 UTC m=+1083.970099140" observedRunningTime="2026-02-15 17:23:49.44638998 +0000 UTC m=+1085.389798122" watchObservedRunningTime="2026-02-15 17:23:49.448705712 +0000 UTC m=+1085.392113844" Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.614488 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 15 17:23:49 crc kubenswrapper[4585]: I0215 17:23:49.940483 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection 
refused" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.204760 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.422569 4585 generic.go:334] "Generic (PLEG): container finished" podID="24b8f662-ff47-41d2-b351-94bde6050902" containerID="64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe" exitCode=0 Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.422615 4585 generic.go:334] "Generic (PLEG): container finished" podID="24b8f662-ff47-41d2-b351-94bde6050902" containerID="1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a" exitCode=2 Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.422625 4585 generic.go:334] "Generic (PLEG): container finished" podID="24b8f662-ff47-41d2-b351-94bde6050902" containerID="7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978" exitCode=0 Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.422665 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerDied","Data":"64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe"} Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.422691 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerDied","Data":"1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a"} Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.422702 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerDied","Data":"7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978"} Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.424674 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c0fc65d-8f21-4e29-819a-06cb632e02cf","Type":"ContainerStarted","Data":"234eda72f2b580abfa7fad873bfdadb3f6aa979e213160b38c58daa1735d0f90"} Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.778188 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd86j"] Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.779810 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.782955 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.783158 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jw488" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.783341 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.788364 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd86j"] Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.879085 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-scripts\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.883399 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.884048 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-config-data\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.884445 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l464\" (UniqueName: \"kubernetes.io/projected/3921dd00-14a5-4825-b135-5acc7a95a162-kube-api-access-8l464\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.987978 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.988028 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-config-data\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" 
Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.988063 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l464\" (UniqueName: \"kubernetes.io/projected/3921dd00-14a5-4825-b135-5acc7a95a162-kube-api-access-8l464\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.988105 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-scripts\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.992436 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.993652 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-scripts\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:50 crc kubenswrapper[4585]: I0215 17:23:50.994708 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-config-data\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:51 crc kubenswrapper[4585]: 
I0215 17:23:51.010176 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l464\" (UniqueName: \"kubernetes.io/projected/3921dd00-14a5-4825-b135-5acc7a95a162-kube-api-access-8l464\") pod \"nova-cell0-conductor-db-sync-kd86j\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:51 crc kubenswrapper[4585]: I0215 17:23:51.155458 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:23:51 crc kubenswrapper[4585]: I0215 17:23:51.460780 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c0fc65d-8f21-4e29-819a-06cb632e02cf","Type":"ContainerStarted","Data":"83c1e7dc7466abd382e482749b9d74a60bb3ffdfc194366ca2da75376e286709"} Feb 15 17:23:51 crc kubenswrapper[4585]: I0215 17:23:51.741553 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd86j"] Feb 15 17:23:52 crc kubenswrapper[4585]: I0215 17:23:52.471583 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c0fc65d-8f21-4e29-819a-06cb632e02cf","Type":"ContainerStarted","Data":"f63616f0c308d9ffd7d0030ac94bacdf246383bd10ab3e9c05edef22885b80d6"} Feb 15 17:23:52 crc kubenswrapper[4585]: I0215 17:23:52.476023 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd86j" event={"ID":"3921dd00-14a5-4825-b135-5acc7a95a162","Type":"ContainerStarted","Data":"21204b8e4cf6536a03ba76d3264a9b47fc528848eef920fd906e9cb3f6119002"} Feb 15 17:23:52 crc kubenswrapper[4585]: I0215 17:23:52.490566 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.490555533 podStartE2EDuration="4.490555533s" podCreationTimestamp="2026-02-15 17:23:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:23:52.488574339 +0000 UTC m=+1088.431982471" watchObservedRunningTime="2026-02-15 17:23:52.490555533 +0000 UTC m=+1088.433963665" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.230020 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.271706 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtd2k\" (UniqueName: \"kubernetes.io/projected/24b8f662-ff47-41d2-b351-94bde6050902-kube-api-access-dtd2k\") pod \"24b8f662-ff47-41d2-b351-94bde6050902\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.271765 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-log-httpd\") pod \"24b8f662-ff47-41d2-b351-94bde6050902\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.271798 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-config-data\") pod \"24b8f662-ff47-41d2-b351-94bde6050902\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.271843 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-scripts\") pod \"24b8f662-ff47-41d2-b351-94bde6050902\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.271945 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-sg-core-conf-yaml\") pod \"24b8f662-ff47-41d2-b351-94bde6050902\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.272017 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-combined-ca-bundle\") pod \"24b8f662-ff47-41d2-b351-94bde6050902\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.272037 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-run-httpd\") pod \"24b8f662-ff47-41d2-b351-94bde6050902\" (UID: \"24b8f662-ff47-41d2-b351-94bde6050902\") " Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.274270 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "24b8f662-ff47-41d2-b351-94bde6050902" (UID: "24b8f662-ff47-41d2-b351-94bde6050902"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.274631 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "24b8f662-ff47-41d2-b351-94bde6050902" (UID: "24b8f662-ff47-41d2-b351-94bde6050902"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.284287 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b8f662-ff47-41d2-b351-94bde6050902-kube-api-access-dtd2k" (OuterVolumeSpecName: "kube-api-access-dtd2k") pod "24b8f662-ff47-41d2-b351-94bde6050902" (UID: "24b8f662-ff47-41d2-b351-94bde6050902"). InnerVolumeSpecName "kube-api-access-dtd2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.286885 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-scripts" (OuterVolumeSpecName: "scripts") pod "24b8f662-ff47-41d2-b351-94bde6050902" (UID: "24b8f662-ff47-41d2-b351-94bde6050902"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.372830 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "24b8f662-ff47-41d2-b351-94bde6050902" (UID: "24b8f662-ff47-41d2-b351-94bde6050902"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.374209 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.374228 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.374239 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtd2k\" (UniqueName: \"kubernetes.io/projected/24b8f662-ff47-41d2-b351-94bde6050902-kube-api-access-dtd2k\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.374252 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24b8f662-ff47-41d2-b351-94bde6050902-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.374260 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.428566 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24b8f662-ff47-41d2-b351-94bde6050902" (UID: "24b8f662-ff47-41d2-b351-94bde6050902"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.485959 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.492739 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-config-data" (OuterVolumeSpecName: "config-data") pod "24b8f662-ff47-41d2-b351-94bde6050902" (UID: "24b8f662-ff47-41d2-b351-94bde6050902"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.547805 4585 generic.go:334] "Generic (PLEG): container finished" podID="24b8f662-ff47-41d2-b351-94bde6050902" containerID="901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169" exitCode=0 Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.547853 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerDied","Data":"901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169"} Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.547880 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24b8f662-ff47-41d2-b351-94bde6050902","Type":"ContainerDied","Data":"a3aabb2d2b55b5f121501192bd8bcf6b007bc3480ab5e5f3df2013ea02c2b498"} Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.547897 4585 scope.go:117] "RemoveContainer" containerID="64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.548052 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.590213 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b8f662-ff47-41d2-b351-94bde6050902-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.621803 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.624339 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.627771 4585 scope.go:117] "RemoveContainer" containerID="1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.659847 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.671741 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-notification-agent" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.671777 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-notification-agent" Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.671821 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-central-agent" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.671828 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-central-agent" Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.671860 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="sg-core" Feb 15 17:23:54 crc 
kubenswrapper[4585]: I0215 17:23:54.671867 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="sg-core" Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.671894 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="proxy-httpd" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.671900 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="proxy-httpd" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.672293 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="sg-core" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.672317 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-central-agent" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.672328 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="ceilometer-notification-agent" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.672339 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b8f662-ff47-41d2-b351-94bde6050902" containerName="proxy-httpd" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.690240 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.695184 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.695441 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.705911 4585 scope.go:117] "RemoveContainer" containerID="7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.764535 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.819015 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.821237 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-scripts\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.821335 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.821358 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-run-httpd\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.821409 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4m4w\" (UniqueName: \"kubernetes.io/projected/dcd5569f-6c47-4603-8ec4-0bc061b47e29-kube-api-access-x4m4w\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.821498 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-log-httpd\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.821532 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-config-data\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.842990 4585 scope.go:117] "RemoveContainer" containerID="901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.891959 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b8f662-ff47-41d2-b351-94bde6050902" path="/var/lib/kubelet/pods/24b8f662-ff47-41d2-b351-94bde6050902/volumes" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.899570 4585 scope.go:117] "RemoveContainer" containerID="64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe" Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.900151 
4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe\": container with ID starting with 64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe not found: ID does not exist" containerID="64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.900200 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe"} err="failed to get container status \"64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe\": rpc error: code = NotFound desc = could not find container \"64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe\": container with ID starting with 64b94f2aa5db46674de72fd59b39063d775bfceeccea8269283893fd9f4f1ffe not found: ID does not exist" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.900225 4585 scope.go:117] "RemoveContainer" containerID="1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a" Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.900502 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a\": container with ID starting with 1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a not found: ID does not exist" containerID="1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.900525 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a"} err="failed to get container status \"1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a\": rpc error: code = 
NotFound desc = could not find container \"1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a\": container with ID starting with 1fc8d99296ae57bc51b282ef7c74bdc8df446f38cdf5a324a4108062beb5353a not found: ID does not exist" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.900542 4585 scope.go:117] "RemoveContainer" containerID="7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978" Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.900877 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978\": container with ID starting with 7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978 not found: ID does not exist" containerID="7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.900899 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978"} err="failed to get container status \"7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978\": rpc error: code = NotFound desc = could not find container \"7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978\": container with ID starting with 7cf26c1be668a12839d56812ea7078f42b9fe43c555cd8b8f89b42140ea9f978 not found: ID does not exist" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.900911 4585 scope.go:117] "RemoveContainer" containerID="901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169" Feb 15 17:23:54 crc kubenswrapper[4585]: E0215 17:23:54.901251 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169\": container with ID starting with 
901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169 not found: ID does not exist" containerID="901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.901267 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169"} err="failed to get container status \"901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169\": rpc error: code = NotFound desc = could not find container \"901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169\": container with ID starting with 901bc2605fc4c98a0f5ce1b4ead1c742bfb04fda958e82d79557d8242692e169 not found: ID does not exist" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.926270 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-log-httpd\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.926321 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-config-data\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.926399 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.926446 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-scripts\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.926513 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.926531 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-run-httpd\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.926567 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4m4w\" (UniqueName: \"kubernetes.io/projected/dcd5569f-6c47-4603-8ec4-0bc061b47e29-kube-api-access-x4m4w\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.930615 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-log-httpd\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.931676 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-run-httpd\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.935124 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-config-data\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.940101 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.943061 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-scripts\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.954849 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:54 crc kubenswrapper[4585]: I0215 17:23:54.957027 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4m4w\" (UniqueName: \"kubernetes.io/projected/dcd5569f-6c47-4603-8ec4-0bc061b47e29-kube-api-access-x4m4w\") pod \"ceilometer-0\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " pod="openstack/ceilometer-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.016295 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.028717 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.028753 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.076320 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.120964 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.568175 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.568395 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 15 17:23:55 crc kubenswrapper[4585]: I0215 17:23:55.575448 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:55 crc kubenswrapper[4585]: W0215 17:23:55.580660 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd5569f_6c47_4603_8ec4_0bc061b47e29.slice/crio-b7f0136df35d48912c1e6dbf740724de43f0bbb7e3ab8ef9647b83bccdbf3055 WatchSource:0}: Error finding container b7f0136df35d48912c1e6dbf740724de43f0bbb7e3ab8ef9647b83bccdbf3055: Status 404 returned error can't find the container with id b7f0136df35d48912c1e6dbf740724de43f0bbb7e3ab8ef9647b83bccdbf3055 Feb 15 17:23:56 crc kubenswrapper[4585]: I0215 17:23:56.581342 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerStarted","Data":"bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2"} Feb 15 17:23:56 crc kubenswrapper[4585]: I0215 17:23:56.582568 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerStarted","Data":"b7f0136df35d48912c1e6dbf740724de43f0bbb7e3ab8ef9647b83bccdbf3055"} Feb 15 17:23:57 crc kubenswrapper[4585]: I0215 17:23:57.630221 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 15 17:23:57 crc kubenswrapper[4585]: I0215 17:23:57.630674 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 15 17:23:57 crc kubenswrapper[4585]: I0215 17:23:57.631626 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3682ca9c-f964-4e4b-ba4a-489a96ef3f65","Type":"ContainerStarted","Data":"6ada9b66f667ef6cfd0a95def1c13751c7f999d22df9ca46eea88c36e8df0349"} Feb 15 17:23:57 crc kubenswrapper[4585]: I0215 17:23:57.648395 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.887581914 podStartE2EDuration="33.64838256s" podCreationTimestamp="2026-02-15 17:23:24 +0000 UTC" firstStartedPulling="2026-02-15 17:23:25.582905798 +0000 UTC m=+1061.526313930" lastFinishedPulling="2026-02-15 17:23:56.343706444 +0000 UTC m=+1092.287114576" observedRunningTime="2026-02-15 17:23:57.643513018 +0000 UTC m=+1093.586921150" watchObservedRunningTime="2026-02-15 17:23:57.64838256 +0000 UTC m=+1093.591790682" Feb 15 17:23:58 crc kubenswrapper[4585]: I0215 17:23:58.665261 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerStarted","Data":"d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530"} Feb 15 17:23:58 crc kubenswrapper[4585]: I0215 
17:23:58.913940 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:58 crc kubenswrapper[4585]: I0215 17:23:58.913981 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:59 crc kubenswrapper[4585]: I0215 17:23:59.013726 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:59 crc kubenswrapper[4585]: I0215 17:23:59.014215 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:59 crc kubenswrapper[4585]: I0215 17:23:59.696640 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerStarted","Data":"bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae"} Feb 15 17:23:59 crc kubenswrapper[4585]: I0215 17:23:59.696846 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:59 crc kubenswrapper[4585]: I0215 17:23:59.698328 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 15 17:23:59 crc kubenswrapper[4585]: I0215 17:23:59.754502 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:23:59 crc kubenswrapper[4585]: I0215 17:23:59.933107 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection refused" Feb 15 17:24:00 crc kubenswrapper[4585]: I0215 17:24:00.183828 4585 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused" Feb 15 17:24:00 crc kubenswrapper[4585]: I0215 17:24:00.183909 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:24:00 crc kubenswrapper[4585]: I0215 17:24:00.184639 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"d01e34d88d41c2f497ca91c0ab4c87883a6e19c0ac368b221e8e825d565e3a27"} pod="openstack/horizon-5fb7dd448-vc5x5" containerMessage="Container horizon failed startup probe, will be restarted" Feb 15 17:24:00 crc kubenswrapper[4585]: I0215 17:24:00.184678 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" containerID="cri-o://d01e34d88d41c2f497ca91c0ab4c87883a6e19c0ac368b221e8e825d565e3a27" gracePeriod=30 Feb 15 17:24:00 crc kubenswrapper[4585]: I0215 17:24:00.992661 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 15 17:24:00 crc kubenswrapper[4585]: I0215 17:24:00.992758 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 15 17:24:01 crc kubenswrapper[4585]: I0215 17:24:01.337204 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 15 17:24:04 crc kubenswrapper[4585]: I0215 17:24:04.767966 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 15 17:24:04 crc kubenswrapper[4585]: I0215 17:24:04.768511 4585 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Feb 15 17:24:04 crc kubenswrapper[4585]: I0215 17:24:04.816860 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 15 17:24:09 crc kubenswrapper[4585]: E0215 17:24:09.225061 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Feb 15 17:24:09 crc kubenswrapper[4585]: E0215 17:24:09.225548 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8l464,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-kd86j_openstack(3921dd00-14a5-4825-b135-5acc7a95a162): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 15 17:24:09 crc kubenswrapper[4585]: E0215 17:24:09.226724 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-kd86j" podUID="3921dd00-14a5-4825-b135-5acc7a95a162" Feb 15 17:24:09 crc kubenswrapper[4585]: E0215 17:24:09.826031 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-kd86j" podUID="3921dd00-14a5-4825-b135-5acc7a95a162" Feb 15 17:24:10 crc kubenswrapper[4585]: I0215 17:24:10.835577 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerStarted","Data":"ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c"} Feb 15 17:24:10 crc kubenswrapper[4585]: I0215 17:24:10.835882 4585 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:24:10 crc kubenswrapper[4585]: I0215 17:24:10.835795 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-central-agent" containerID="cri-o://bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2" gracePeriod=30 Feb 15 17:24:10 crc kubenswrapper[4585]: I0215 17:24:10.835983 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="proxy-httpd" containerID="cri-o://ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c" gracePeriod=30 Feb 15 17:24:10 crc kubenswrapper[4585]: I0215 17:24:10.836059 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-notification-agent" containerID="cri-o://d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530" gracePeriod=30 Feb 15 17:24:10 crc kubenswrapper[4585]: I0215 17:24:10.836095 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="sg-core" containerID="cri-o://bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae" gracePeriod=30 Feb 15 17:24:11 crc kubenswrapper[4585]: I0215 17:24:11.851147 4585 generic.go:334] "Generic (PLEG): container finished" podID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerID="ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c" exitCode=0 Feb 15 17:24:11 crc kubenswrapper[4585]: I0215 17:24:11.851727 4585 generic.go:334] "Generic (PLEG): container finished" podID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerID="bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae" 
exitCode=2 Feb 15 17:24:11 crc kubenswrapper[4585]: I0215 17:24:11.851744 4585 generic.go:334] "Generic (PLEG): container finished" podID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerID="d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530" exitCode=0 Feb 15 17:24:11 crc kubenswrapper[4585]: I0215 17:24:11.851235 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerDied","Data":"ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c"} Feb 15 17:24:11 crc kubenswrapper[4585]: I0215 17:24:11.851787 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerDied","Data":"bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae"} Feb 15 17:24:11 crc kubenswrapper[4585]: I0215 17:24:11.851805 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerDied","Data":"d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530"} Feb 15 17:24:12 crc kubenswrapper[4585]: I0215 17:24:12.883503 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:24:12 crc kubenswrapper[4585]: I0215 17:24:12.926064 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.499984248 podStartE2EDuration="18.926043874s" podCreationTimestamp="2026-02-15 17:23:54 +0000 UTC" firstStartedPulling="2026-02-15 17:23:55.586447906 +0000 UTC m=+1091.529856038" lastFinishedPulling="2026-02-15 17:24:10.012507532 +0000 UTC m=+1105.955915664" observedRunningTime="2026-02-15 17:24:10.889348356 +0000 UTC m=+1106.832756498" watchObservedRunningTime="2026-02-15 17:24:12.926043874 +0000 UTC m=+1108.869452006" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 
17:24:13.501418 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674327 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-log-httpd\") pod \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674454 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-config-data\") pod \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674550 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-scripts\") pod \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674614 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4m4w\" (UniqueName: \"kubernetes.io/projected/dcd5569f-6c47-4603-8ec4-0bc061b47e29-kube-api-access-x4m4w\") pod \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674683 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-run-httpd\") pod \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674717 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-sg-core-conf-yaml\") pod \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674771 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-combined-ca-bundle\") pod \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\" (UID: \"dcd5569f-6c47-4603-8ec4-0bc061b47e29\") " Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.674954 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dcd5569f-6c47-4603-8ec4-0bc061b47e29" (UID: "dcd5569f-6c47-4603-8ec4-0bc061b47e29"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.675972 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dcd5569f-6c47-4603-8ec4-0bc061b47e29" (UID: "dcd5569f-6c47-4603-8ec4-0bc061b47e29"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.676800 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.676824 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dcd5569f-6c47-4603-8ec4-0bc061b47e29-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.679729 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd5569f-6c47-4603-8ec4-0bc061b47e29-kube-api-access-x4m4w" (OuterVolumeSpecName: "kube-api-access-x4m4w") pod "dcd5569f-6c47-4603-8ec4-0bc061b47e29" (UID: "dcd5569f-6c47-4603-8ec4-0bc061b47e29"). InnerVolumeSpecName "kube-api-access-x4m4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.682233 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-scripts" (OuterVolumeSpecName: "scripts") pod "dcd5569f-6c47-4603-8ec4-0bc061b47e29" (UID: "dcd5569f-6c47-4603-8ec4-0bc061b47e29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.752040 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dcd5569f-6c47-4603-8ec4-0bc061b47e29" (UID: "dcd5569f-6c47-4603-8ec4-0bc061b47e29"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.780266 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.780295 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4m4w\" (UniqueName: \"kubernetes.io/projected/dcd5569f-6c47-4603-8ec4-0bc061b47e29-kube-api-access-x4m4w\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.780307 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.808704 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcd5569f-6c47-4603-8ec4-0bc061b47e29" (UID: "dcd5569f-6c47-4603-8ec4-0bc061b47e29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.843838 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-config-data" (OuterVolumeSpecName: "config-data") pod "dcd5569f-6c47-4603-8ec4-0bc061b47e29" (UID: "dcd5569f-6c47-4603-8ec4-0bc061b47e29"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.879106 4585 generic.go:334] "Generic (PLEG): container finished" podID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerID="bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2" exitCode=0 Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.879147 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerDied","Data":"bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2"} Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.879173 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dcd5569f-6c47-4603-8ec4-0bc061b47e29","Type":"ContainerDied","Data":"b7f0136df35d48912c1e6dbf740724de43f0bbb7e3ab8ef9647b83bccdbf3055"} Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.879189 4585 scope.go:117] "RemoveContainer" containerID="ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.879319 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.883989 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.884019 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcd5569f-6c47-4603-8ec4-0bc061b47e29-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.911404 4585 scope.go:117] "RemoveContainer" containerID="bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.925731 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.953811 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.958796 4585 scope.go:117] "RemoveContainer" containerID="d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.965815 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:13 crc kubenswrapper[4585]: E0215 17:24:13.974903 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-notification-agent" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.975146 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-notification-agent" Feb 15 17:24:13 crc kubenswrapper[4585]: E0215 17:24:13.975225 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" 
containerName="proxy-httpd" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.975288 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="proxy-httpd" Feb 15 17:24:13 crc kubenswrapper[4585]: E0215 17:24:13.977145 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="sg-core" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.980484 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="sg-core" Feb 15 17:24:13 crc kubenswrapper[4585]: E0215 17:24:13.980637 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-central-agent" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.980698 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-central-agent" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.981113 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="proxy-httpd" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.981199 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-central-agent" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.981278 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="sg-core" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.981375 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" containerName="ceilometer-notification-agent" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.985384 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.991246 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 15 17:24:13 crc kubenswrapper[4585]: I0215 17:24:13.991579 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.010301 4585 scope.go:117] "RemoveContainer" containerID="bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.016129 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.031533 4585 scope.go:117] "RemoveContainer" containerID="ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c" Feb 15 17:24:14 crc kubenswrapper[4585]: E0215 17:24:14.032002 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c\": container with ID starting with ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c not found: ID does not exist" containerID="ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.032098 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c"} err="failed to get container status \"ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c\": rpc error: code = NotFound desc = could not find container \"ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c\": container with ID starting with ed8b7a88e862c396f7cc6dc915178cb265bdd8ccbd31efc25aba913a86db044c not found: ID does not exist" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 
17:24:14.032178 4585 scope.go:117] "RemoveContainer" containerID="bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae" Feb 15 17:24:14 crc kubenswrapper[4585]: E0215 17:24:14.032411 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae\": container with ID starting with bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae not found: ID does not exist" containerID="bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.032486 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae"} err="failed to get container status \"bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae\": rpc error: code = NotFound desc = could not find container \"bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae\": container with ID starting with bcee9d8ed5fa0a93f09121c214321af3e3cbaa2913f62e2621e97a73e2e7a9ae not found: ID does not exist" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.032553 4585 scope.go:117] "RemoveContainer" containerID="d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530" Feb 15 17:24:14 crc kubenswrapper[4585]: E0215 17:24:14.032854 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530\": container with ID starting with d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530 not found: ID does not exist" containerID="d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.032971 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530"} err="failed to get container status \"d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530\": rpc error: code = NotFound desc = could not find container \"d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530\": container with ID starting with d95bd5029457a73c130e79731aafadd6f963392cc3970e299156650ee901c530 not found: ID does not exist" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.033068 4585 scope.go:117] "RemoveContainer" containerID="bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2" Feb 15 17:24:14 crc kubenswrapper[4585]: E0215 17:24:14.033308 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2\": container with ID starting with bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2 not found: ID does not exist" containerID="bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.033391 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2"} err="failed to get container status \"bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2\": rpc error: code = NotFound desc = could not find container \"bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2\": container with ID starting with bea83dda3cfaaa247f1b752133374f368565f8e2e3176d202b8a22ea0fdad9c2 not found: ID does not exist" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.087901 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.088316 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-scripts\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.088476 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.088669 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-config-data\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.088813 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-run-httpd\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.088966 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-log-httpd\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.089106 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxxxh\" (UniqueName: \"kubernetes.io/projected/c30c4941-af94-4fdc-b84e-f7761e1ca799-kube-api-access-qxxxh\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.190650 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-config-data\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.191421 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-run-httpd\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.191575 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-log-httpd\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.192300 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxxxh\" (UniqueName: \"kubernetes.io/projected/c30c4941-af94-4fdc-b84e-f7761e1ca799-kube-api-access-qxxxh\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.192715 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.192890 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-scripts\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.192991 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.192033 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-log-httpd\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.192224 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-run-httpd\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.195181 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-config-data\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.195582 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.198448 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-scripts\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.199250 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.209237 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxxxh\" (UniqueName: \"kubernetes.io/projected/c30c4941-af94-4fdc-b84e-f7761e1ca799-kube-api-access-qxxxh\") pod \"ceilometer-0\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.311741 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.814741 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.892222 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd5569f-6c47-4603-8ec4-0bc061b47e29" path="/var/lib/kubelet/pods/dcd5569f-6c47-4603-8ec4-0bc061b47e29/volumes" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.894337 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:24:14 crc kubenswrapper[4585]: I0215 17:24:14.902065 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerStarted","Data":"e617d23fa4d37946afb90dd5f104e5ac7908ed11f5df6e46fc3f9149bd6bd27c"} Feb 15 17:24:15 crc kubenswrapper[4585]: I0215 17:24:15.914585 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerStarted","Data":"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423"} Feb 15 17:24:16 crc kubenswrapper[4585]: I0215 17:24:16.923814 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerStarted","Data":"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230"} Feb 15 17:24:17 crc kubenswrapper[4585]: I0215 17:24:17.014569 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:24:17 crc kubenswrapper[4585]: I0215 17:24:17.014643 4585 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:24:17 crc kubenswrapper[4585]: I0215 17:24:17.945949 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerStarted","Data":"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e"} Feb 15 17:24:18 crc kubenswrapper[4585]: I0215 17:24:18.957786 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerStarted","Data":"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82"} Feb 15 17:24:18 crc kubenswrapper[4585]: I0215 17:24:18.959365 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:24:22 crc kubenswrapper[4585]: I0215 17:24:22.869031 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.286699644 podStartE2EDuration="9.869011497s" podCreationTimestamp="2026-02-15 17:24:13 +0000 UTC" firstStartedPulling="2026-02-15 17:24:14.823699623 +0000 UTC m=+1110.767107755" lastFinishedPulling="2026-02-15 17:24:18.406011476 +0000 UTC m=+1114.349419608" observedRunningTime="2026-02-15 17:24:18.98693829 +0000 UTC m=+1114.930346412" watchObservedRunningTime="2026-02-15 17:24:22.869011497 +0000 UTC m=+1118.812419629" Feb 15 17:24:24 crc kubenswrapper[4585]: I0215 17:24:24.015145 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd86j" event={"ID":"3921dd00-14a5-4825-b135-5acc7a95a162","Type":"ContainerStarted","Data":"af093457829ccbf6cf287f4fadffcdeb3aae7d3500b8e397a433515a18e4853f"} Feb 15 17:24:24 crc 
kubenswrapper[4585]: I0215 17:24:24.061232 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-kd86j" podStartSLOduration=2.511501281 podStartE2EDuration="34.06121617s" podCreationTimestamp="2026-02-15 17:23:50 +0000 UTC" firstStartedPulling="2026-02-15 17:23:51.753218098 +0000 UTC m=+1087.696626230" lastFinishedPulling="2026-02-15 17:24:23.302932987 +0000 UTC m=+1119.246341119" observedRunningTime="2026-02-15 17:24:24.056656496 +0000 UTC m=+1120.000064628" watchObservedRunningTime="2026-02-15 17:24:24.06121617 +0000 UTC m=+1120.004624302" Feb 15 17:24:24 crc kubenswrapper[4585]: I0215 17:24:24.878509 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:24 crc kubenswrapper[4585]: I0215 17:24:24.878776 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-central-agent" containerID="cri-o://590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423" gracePeriod=30 Feb 15 17:24:24 crc kubenswrapper[4585]: I0215 17:24:24.878904 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="proxy-httpd" containerID="cri-o://27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82" gracePeriod=30 Feb 15 17:24:24 crc kubenswrapper[4585]: I0215 17:24:24.878962 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="sg-core" containerID="cri-o://b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e" gracePeriod=30 Feb 15 17:24:24 crc kubenswrapper[4585]: I0215 17:24:24.879011 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-notification-agent" containerID="cri-o://02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230" gracePeriod=30 Feb 15 17:24:25 crc kubenswrapper[4585]: I0215 17:24:25.073939 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerDied","Data":"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e"} Feb 15 17:24:25 crc kubenswrapper[4585]: I0215 17:24:25.073789 4585 generic.go:334] "Generic (PLEG): container finished" podID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerID="b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e" exitCode=2 Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.642726 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.760378 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-log-httpd\") pod \"c30c4941-af94-4fdc-b84e-f7761e1ca799\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.760432 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-combined-ca-bundle\") pod \"c30c4941-af94-4fdc-b84e-f7761e1ca799\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.760459 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-run-httpd\") pod \"c30c4941-af94-4fdc-b84e-f7761e1ca799\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " Feb 15 17:24:26 crc 
kubenswrapper[4585]: I0215 17:24:25.760632 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-sg-core-conf-yaml\") pod \"c30c4941-af94-4fdc-b84e-f7761e1ca799\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.760726 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxxxh\" (UniqueName: \"kubernetes.io/projected/c30c4941-af94-4fdc-b84e-f7761e1ca799-kube-api-access-qxxxh\") pod \"c30c4941-af94-4fdc-b84e-f7761e1ca799\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.760770 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-config-data\") pod \"c30c4941-af94-4fdc-b84e-f7761e1ca799\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.760878 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-scripts\") pod \"c30c4941-af94-4fdc-b84e-f7761e1ca799\" (UID: \"c30c4941-af94-4fdc-b84e-f7761e1ca799\") " Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.761423 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c30c4941-af94-4fdc-b84e-f7761e1ca799" (UID: "c30c4941-af94-4fdc-b84e-f7761e1ca799"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.761668 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c30c4941-af94-4fdc-b84e-f7761e1ca799" (UID: "c30c4941-af94-4fdc-b84e-f7761e1ca799"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.774882 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30c4941-af94-4fdc-b84e-f7761e1ca799-kube-api-access-qxxxh" (OuterVolumeSpecName: "kube-api-access-qxxxh") pod "c30c4941-af94-4fdc-b84e-f7761e1ca799" (UID: "c30c4941-af94-4fdc-b84e-f7761e1ca799"). InnerVolumeSpecName "kube-api-access-qxxxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.778656 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-scripts" (OuterVolumeSpecName: "scripts") pod "c30c4941-af94-4fdc-b84e-f7761e1ca799" (UID: "c30c4941-af94-4fdc-b84e-f7761e1ca799"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.809003 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c30c4941-af94-4fdc-b84e-f7761e1ca799" (UID: "c30c4941-af94-4fdc-b84e-f7761e1ca799"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.856782 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c30c4941-af94-4fdc-b84e-f7761e1ca799" (UID: "c30c4941-af94-4fdc-b84e-f7761e1ca799"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.863576 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxxxh\" (UniqueName: \"kubernetes.io/projected/c30c4941-af94-4fdc-b84e-f7761e1ca799-kube-api-access-qxxxh\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.863669 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.863706 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.863717 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.863726 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c30c4941-af94-4fdc-b84e-f7761e1ca799-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.863738 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.893109 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-config-data" (OuterVolumeSpecName: "config-data") pod "c30c4941-af94-4fdc-b84e-f7761e1ca799" (UID: "c30c4941-af94-4fdc-b84e-f7761e1ca799"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:25.965714 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c30c4941-af94-4fdc-b84e-f7761e1ca799-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094413 4585 generic.go:334] "Generic (PLEG): container finished" podID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerID="27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82" exitCode=0 Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094460 4585 generic.go:334] "Generic (PLEG): container finished" podID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerID="02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230" exitCode=0 Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094475 4585 generic.go:334] "Generic (PLEG): container finished" podID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerID="590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423" exitCode=0 Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094507 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerDied","Data":"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82"} Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094546 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerDied","Data":"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230"} Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094567 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerDied","Data":"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423"} Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094585 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c30c4941-af94-4fdc-b84e-f7761e1ca799","Type":"ContainerDied","Data":"e617d23fa4d37946afb90dd5f104e5ac7908ed11f5df6e46fc3f9149bd6bd27c"} Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094637 4585 scope.go:117] "RemoveContainer" containerID="27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.094839 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.148992 4585 scope.go:117] "RemoveContainer" containerID="b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.161837 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.179949 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.195871 4585 scope.go:117] "RemoveContainer" containerID="02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.232207 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.232741 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-central-agent" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.232760 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-central-agent" Feb 15 17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.232787 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="sg-core" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.232794 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="sg-core" Feb 15 17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.232816 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-notification-agent" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.232823 4585 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-notification-agent" Feb 15 17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.232833 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="proxy-httpd" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.232839 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="proxy-httpd" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.233048 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="sg-core" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.233066 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-notification-agent" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.233072 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="ceilometer-central-agent" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.233081 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" containerName="proxy-httpd" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.235617 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.237365 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.241519 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.242829 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.274209 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.274291 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.274321 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc2f6\" (UniqueName: \"kubernetes.io/projected/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-kube-api-access-wc2f6\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.274395 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-scripts\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " 
pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.274412 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.274439 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.274467 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-config-data\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.277483 4585 scope.go:117] "RemoveContainer" containerID="590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.300764 4585 scope.go:117] "RemoveContainer" containerID="27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82" Feb 15 17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.301143 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82\": container with ID starting with 27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82 not found: ID does not exist" containerID="27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82" Feb 15 17:24:26 crc 
kubenswrapper[4585]: I0215 17:24:26.301182 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82"} err="failed to get container status \"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82\": rpc error: code = NotFound desc = could not find container \"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82\": container with ID starting with 27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.301204 4585 scope.go:117] "RemoveContainer" containerID="b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e" Feb 15 17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.301477 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e\": container with ID starting with b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e not found: ID does not exist" containerID="b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.301499 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e"} err="failed to get container status \"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e\": rpc error: code = NotFound desc = could not find container \"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e\": container with ID starting with b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.301511 4585 scope.go:117] "RemoveContainer" containerID="02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230" Feb 15 
17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.301731 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230\": container with ID starting with 02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230 not found: ID does not exist" containerID="02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.301753 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230"} err="failed to get container status \"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230\": rpc error: code = NotFound desc = could not find container \"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230\": container with ID starting with 02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.301766 4585 scope.go:117] "RemoveContainer" containerID="590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423" Feb 15 17:24:26 crc kubenswrapper[4585]: E0215 17:24:26.301942 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423\": container with ID starting with 590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423 not found: ID does not exist" containerID="590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.301961 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423"} err="failed to get container status 
\"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423\": rpc error: code = NotFound desc = could not find container \"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423\": container with ID starting with 590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.301974 4585 scope.go:117] "RemoveContainer" containerID="27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.302134 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82"} err="failed to get container status \"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82\": rpc error: code = NotFound desc = could not find container \"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82\": container with ID starting with 27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.302154 4585 scope.go:117] "RemoveContainer" containerID="b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.302481 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e"} err="failed to get container status \"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e\": rpc error: code = NotFound desc = could not find container \"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e\": container with ID starting with b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.302500 4585 scope.go:117] "RemoveContainer" 
containerID="02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303293 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230"} err="failed to get container status \"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230\": rpc error: code = NotFound desc = could not find container \"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230\": container with ID starting with 02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303313 4585 scope.go:117] "RemoveContainer" containerID="590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303520 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423"} err="failed to get container status \"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423\": rpc error: code = NotFound desc = could not find container \"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423\": container with ID starting with 590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303541 4585 scope.go:117] "RemoveContainer" containerID="27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303769 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82"} err="failed to get container status \"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82\": rpc error: code = NotFound desc = could 
not find container \"27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82\": container with ID starting with 27d9333a6977899ee9ba7b53c66f45d78afb0d267a9342e9b88b3c1159bf5c82 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303787 4585 scope.go:117] "RemoveContainer" containerID="b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303945 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e"} err="failed to get container status \"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e\": rpc error: code = NotFound desc = could not find container \"b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e\": container with ID starting with b7e70ed01eca689d38c9f4d3a7ccd6f678db98a584bc74ad67fb3dbda341c41e not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.303968 4585 scope.go:117] "RemoveContainer" containerID="02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.304182 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230"} err="failed to get container status \"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230\": rpc error: code = NotFound desc = could not find container \"02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230\": container with ID starting with 02360970b2349fa6b4643282ec113d73880a4a960bba150d8bf5a62f39f1c230 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.304199 4585 scope.go:117] "RemoveContainer" containerID="590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 
17:24:26.304378 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423"} err="failed to get container status \"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423\": rpc error: code = NotFound desc = could not find container \"590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423\": container with ID starting with 590ec9cfb1d69b062cb9cf108ab0e4697d6b1fabf2f3d97f1feeade94c289423 not found: ID does not exist" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.375281 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.375334 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc2f6\" (UniqueName: \"kubernetes.io/projected/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-kube-api-access-wc2f6\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.375405 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-scripts\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.375455 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc 
kubenswrapper[4585]: I0215 17:24:26.375490 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.375513 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-config-data\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.375553 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.376070 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-run-httpd\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.376789 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-log-httpd\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.380107 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.381973 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.389613 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-config-data\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.392788 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-scripts\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.393274 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc2f6\" (UniqueName: \"kubernetes.io/projected/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-kube-api-access-wc2f6\") pod \"ceilometer-0\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.549820 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:24:26 crc kubenswrapper[4585]: I0215 17:24:26.860431 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30c4941-af94-4fdc-b84e-f7761e1ca799" path="/var/lib/kubelet/pods/c30c4941-af94-4fdc-b84e-f7761e1ca799/volumes" Feb 15 17:24:27 crc kubenswrapper[4585]: I0215 17:24:27.068194 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:24:27 crc kubenswrapper[4585]: W0215 17:24:27.077432 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcc8fd49_b2be_4e55_9e01_0ef1949e5c66.slice/crio-2459720c09485bed9a65f48d51cf95c313576020a548c7daa9ccba7236f939ff WatchSource:0}: Error finding container 2459720c09485bed9a65f48d51cf95c313576020a548c7daa9ccba7236f939ff: Status 404 returned error can't find the container with id 2459720c09485bed9a65f48d51cf95c313576020a548c7daa9ccba7236f939ff Feb 15 17:24:27 crc kubenswrapper[4585]: I0215 17:24:27.107903 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerStarted","Data":"2459720c09485bed9a65f48d51cf95c313576020a548c7daa9ccba7236f939ff"} Feb 15 17:24:28 crc kubenswrapper[4585]: I0215 17:24:28.187797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerStarted","Data":"2ffd749467ec493fbe19ebd3cfde8329fc2182bf003da1bdad9d6bc7de5f2ae1"} Feb 15 17:24:29 crc kubenswrapper[4585]: I0215 17:24:29.207043 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerStarted","Data":"5bb9619bd93dbbdf1cd8c46a3bf6ad909a7d218dafc12b447182c1e4103cf556"} Feb 15 17:24:29 crc kubenswrapper[4585]: I0215 17:24:29.207500 4585 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerStarted","Data":"2c159d4e934b99bb7cc26fd49aa67674f8490744043a941f631410a12f8581b9"} Feb 15 17:24:30 crc kubenswrapper[4585]: I0215 17:24:30.221112 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerStarted","Data":"1cdb88e45f977f5af4235e3d2f1482a82f1790c754743c80f0293f9eb4e4f95f"} Feb 15 17:24:30 crc kubenswrapper[4585]: I0215 17:24:30.221417 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:24:30 crc kubenswrapper[4585]: I0215 17:24:30.266097 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.472658499 podStartE2EDuration="4.266073448s" podCreationTimestamp="2026-02-15 17:24:26 +0000 UTC" firstStartedPulling="2026-02-15 17:24:27.079677842 +0000 UTC m=+1123.023085964" lastFinishedPulling="2026-02-15 17:24:29.873092771 +0000 UTC m=+1125.816500913" observedRunningTime="2026-02-15 17:24:30.254428132 +0000 UTC m=+1126.197836264" watchObservedRunningTime="2026-02-15 17:24:30.266073448 +0000 UTC m=+1126.209481590" Feb 15 17:24:31 crc kubenswrapper[4585]: I0215 17:24:31.235964 4585 generic.go:334] "Generic (PLEG): container finished" podID="b1bd46e7-0703-49b5-81f2-516568284547" containerID="d01e34d88d41c2f497ca91c0ab4c87883a6e19c0ac368b221e8e825d565e3a27" exitCode=137 Feb 15 17:24:31 crc kubenswrapper[4585]: I0215 17:24:31.236050 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb7dd448-vc5x5" event={"ID":"b1bd46e7-0703-49b5-81f2-516568284547","Type":"ContainerDied","Data":"d01e34d88d41c2f497ca91c0ab4c87883a6e19c0ac368b221e8e825d565e3a27"} Feb 15 17:24:31 crc kubenswrapper[4585]: I0215 17:24:31.236437 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fb7dd448-vc5x5" 
event={"ID":"b1bd46e7-0703-49b5-81f2-516568284547","Type":"ContainerStarted","Data":"80c273208e4a1ca6f7313c6c1c5bdcc9e248d5ea7bcb8e00bf6066df031452e8"} Feb 15 17:24:31 crc kubenswrapper[4585]: I0215 17:24:31.236488 4585 scope.go:117] "RemoveContainer" containerID="03cde210d7e2baae60bc76453feecb1542812009c51804c394167be427f59f34" Feb 15 17:24:35 crc kubenswrapper[4585]: I0215 17:24:35.285267 4585 generic.go:334] "Generic (PLEG): container finished" podID="3921dd00-14a5-4825-b135-5acc7a95a162" containerID="af093457829ccbf6cf287f4fadffcdeb3aae7d3500b8e397a433515a18e4853f" exitCode=0 Feb 15 17:24:35 crc kubenswrapper[4585]: I0215 17:24:35.285385 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd86j" event={"ID":"3921dd00-14a5-4825-b135-5acc7a95a162","Type":"ContainerDied","Data":"af093457829ccbf6cf287f4fadffcdeb3aae7d3500b8e397a433515a18e4853f"} Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.805112 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.863200 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-scripts\") pod \"3921dd00-14a5-4825-b135-5acc7a95a162\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.863263 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-config-data\") pod \"3921dd00-14a5-4825-b135-5acc7a95a162\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.863287 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-combined-ca-bundle\") pod \"3921dd00-14a5-4825-b135-5acc7a95a162\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.863325 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l464\" (UniqueName: \"kubernetes.io/projected/3921dd00-14a5-4825-b135-5acc7a95a162-kube-api-access-8l464\") pod \"3921dd00-14a5-4825-b135-5acc7a95a162\" (UID: \"3921dd00-14a5-4825-b135-5acc7a95a162\") " Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.868429 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3921dd00-14a5-4825-b135-5acc7a95a162-kube-api-access-8l464" (OuterVolumeSpecName: "kube-api-access-8l464") pod "3921dd00-14a5-4825-b135-5acc7a95a162" (UID: "3921dd00-14a5-4825-b135-5acc7a95a162"). InnerVolumeSpecName "kube-api-access-8l464". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.891004 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-scripts" (OuterVolumeSpecName: "scripts") pod "3921dd00-14a5-4825-b135-5acc7a95a162" (UID: "3921dd00-14a5-4825-b135-5acc7a95a162"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.905226 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-config-data" (OuterVolumeSpecName: "config-data") pod "3921dd00-14a5-4825-b135-5acc7a95a162" (UID: "3921dd00-14a5-4825-b135-5acc7a95a162"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.911609 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3921dd00-14a5-4825-b135-5acc7a95a162" (UID: "3921dd00-14a5-4825-b135-5acc7a95a162"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.970130 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.970157 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.970167 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3921dd00-14a5-4825-b135-5acc7a95a162-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:36 crc kubenswrapper[4585]: I0215 17:24:36.970178 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l464\" (UniqueName: \"kubernetes.io/projected/3921dd00-14a5-4825-b135-5acc7a95a162-kube-api-access-8l464\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.311277 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-kd86j" event={"ID":"3921dd00-14a5-4825-b135-5acc7a95a162","Type":"ContainerDied","Data":"21204b8e4cf6536a03ba76d3264a9b47fc528848eef920fd906e9cb3f6119002"} Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.311316 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21204b8e4cf6536a03ba76d3264a9b47fc528848eef920fd906e9cb3f6119002" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.311368 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-kd86j" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.449792 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 15 17:24:37 crc kubenswrapper[4585]: E0215 17:24:37.450433 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3921dd00-14a5-4825-b135-5acc7a95a162" containerName="nova-cell0-conductor-db-sync" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.450448 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="3921dd00-14a5-4825-b135-5acc7a95a162" containerName="nova-cell0-conductor-db-sync" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.450685 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="3921dd00-14a5-4825-b135-5acc7a95a162" containerName="nova-cell0-conductor-db-sync" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.451330 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.464970 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.466063 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jw488" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.467255 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.479225 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254bedb1-5fad-4481-a643-4c7b6872eaf0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 
17:24:37.479302 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254bedb1-5fad-4481-a643-4c7b6872eaf0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.479335 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgq8c\" (UniqueName: \"kubernetes.io/projected/254bedb1-5fad-4481-a643-4c7b6872eaf0-kube-api-access-lgq8c\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.580636 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254bedb1-5fad-4481-a643-4c7b6872eaf0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.580741 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254bedb1-5fad-4481-a643-4c7b6872eaf0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.580775 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgq8c\" (UniqueName: \"kubernetes.io/projected/254bedb1-5fad-4481-a643-4c7b6872eaf0-kube-api-access-lgq8c\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.592901 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/254bedb1-5fad-4481-a643-4c7b6872eaf0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.594428 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/254bedb1-5fad-4481-a643-4c7b6872eaf0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.605966 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgq8c\" (UniqueName: \"kubernetes.io/projected/254bedb1-5fad-4481-a643-4c7b6872eaf0-kube-api-access-lgq8c\") pod \"nova-cell0-conductor-0\" (UID: \"254bedb1-5fad-4481-a643-4c7b6872eaf0\") " pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:37 crc kubenswrapper[4585]: I0215 17:24:37.769232 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:38 crc kubenswrapper[4585]: I0215 17:24:38.262076 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 15 17:24:38 crc kubenswrapper[4585]: I0215 17:24:38.326739 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"254bedb1-5fad-4481-a643-4c7b6872eaf0","Type":"ContainerStarted","Data":"573c1d3fad40ae34b35040a24dea6f1ea78ae69673cb3c9e447eaba3ab116d0e"} Feb 15 17:24:39 crc kubenswrapper[4585]: I0215 17:24:39.338430 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"254bedb1-5fad-4481-a643-4c7b6872eaf0","Type":"ContainerStarted","Data":"9fa17f3220757d08f12a5927a8c1f8f8e8bd0fa6063f7f0f888a05eafb1da68f"} Feb 15 17:24:39 crc kubenswrapper[4585]: I0215 17:24:39.338832 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:39 crc kubenswrapper[4585]: I0215 17:24:39.366970 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.366950322 podStartE2EDuration="2.366950322s" podCreationTimestamp="2026-02-15 17:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:24:39.357176256 +0000 UTC m=+1135.300584408" watchObservedRunningTime="2026-02-15 17:24:39.366950322 +0000 UTC m=+1135.310358474" Feb 15 17:24:40 crc kubenswrapper[4585]: I0215 17:24:40.177472 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:24:40 crc kubenswrapper[4585]: I0215 17:24:40.178566 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:24:40 crc kubenswrapper[4585]: I0215 17:24:40.179130 4585 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused" Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.013873 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.014373 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.014415 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.015093 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f05a1d156be3c851f680780cf5a4d67dc38d38043f804ea0899d1efe3927d68"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.015137 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" 
containerID="cri-o://6f05a1d156be3c851f680780cf5a4d67dc38d38043f804ea0899d1efe3927d68" gracePeriod=600 Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.447408 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="6f05a1d156be3c851f680780cf5a4d67dc38d38043f804ea0899d1efe3927d68" exitCode=0 Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.447453 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"6f05a1d156be3c851f680780cf5a4d67dc38d38043f804ea0899d1efe3927d68"} Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.447479 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"cc021170b16c2c23261c62b89393adc6ccb7098259eaf76788eecda62fff1dea"} Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.447495 4585 scope.go:117] "RemoveContainer" containerID="0cc3960491aa7365ef9a992dbe57461170d7a99f094dd61fc5fce5575354ba90" Feb 15 17:24:47 crc kubenswrapper[4585]: I0215 17:24:47.802107 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.348734 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rrsg2"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.350841 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.361748 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rrsg2"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.364501 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.365561 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.464737 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-scripts\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.465071 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-config-data\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.465221 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25sdq\" (UniqueName: \"kubernetes.io/projected/77018408-f0fc-4655-904c-9090777a235e-kube-api-access-25sdq\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.465670 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.480955 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.490132 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.510570 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.519984 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.566892 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gw9k\" (UniqueName: \"kubernetes.io/projected/6322cc42-9f69-4ed5-a933-4951d5fd8849-kube-api-access-4gw9k\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.566966 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.566996 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-scripts\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 
17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.567012 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-config-data\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.567031 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-config-data\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.567072 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.567107 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25sdq\" (UniqueName: \"kubernetes.io/projected/77018408-f0fc-4655-904c-9090777a235e-kube-api-access-25sdq\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.567156 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6322cc42-9f69-4ed5-a933-4951d5fd8849-logs\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.575519 4585 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.576939 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-config-data\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.583121 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-scripts\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.632246 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25sdq\" (UniqueName: \"kubernetes.io/projected/77018408-f0fc-4655-904c-9090777a235e-kube-api-access-25sdq\") pod \"nova-cell0-cell-mapping-rrsg2\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.647248 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.649012 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.659044 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.671037 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.672152 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6322cc42-9f69-4ed5-a933-4951d5fd8849-logs\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.672214 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gw9k\" (UniqueName: \"kubernetes.io/projected/6322cc42-9f69-4ed5-a933-4951d5fd8849-kube-api-access-4gw9k\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.685222 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6322cc42-9f69-4ed5-a933-4951d5fd8849-logs\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.685734 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-config-data\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.685873 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.685944 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.702808 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.730006 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gw9k\" (UniqueName: \"kubernetes.io/projected/6322cc42-9f69-4ed5-a933-4951d5fd8849-kube-api-access-4gw9k\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.742224 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-config-data\") pod \"nova-api-0\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.787818 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-logs\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.787875 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.787972 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-config-data\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.787987 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5jq7\" (UniqueName: \"kubernetes.io/projected/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-kube-api-access-d5jq7\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.794650 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.796086 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.810578 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.833873 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.843076 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890287 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890360 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-logs\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890400 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890429 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-config-data\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890497 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2jv\" (UniqueName: \"kubernetes.io/projected/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-kube-api-access-kf2jv\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890535 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-config-data\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890553 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5jq7\" (UniqueName: \"kubernetes.io/projected/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-kube-api-access-d5jq7\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.890822 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-logs\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.905919 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.914487 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-config-data\") pod \"nova-metadata-0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.927099 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5jq7\" (UniqueName: \"kubernetes.io/projected/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-kube-api-access-d5jq7\") pod \"nova-metadata-0\" (UID: 
\"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " pod="openstack/nova-metadata-0" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.955174 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-45s2p"] Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.957074 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:48 crc kubenswrapper[4585]: I0215 17:24:48.965164 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-45s2p"] Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:48.995290 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-config-data\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:48.995383 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2jv\" (UniqueName: \"kubernetes.io/projected/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-kube-api-access-kf2jv\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:48.995473 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:48.999175 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:48.999754 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-config-data\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.023229 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2jv\" (UniqueName: \"kubernetes.io/projected/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-kube-api-access-kf2jv\") pod \"nova-scheduler-0\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " pod="openstack/nova-scheduler-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.098165 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.098230 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-svc\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.098250 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-config\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 
crc kubenswrapper[4585]: I0215 17:24:49.098307 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqhsd\" (UniqueName: \"kubernetes.io/projected/02233ef6-8c79-4706-aaff-b246384695b8-kube-api-access-fqhsd\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.098376 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.098428 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.169748 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.196930 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.207112 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.207297 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.207322 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-svc\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.207341 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-config\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.207386 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhsd\" (UniqueName: \"kubernetes.io/projected/02233ef6-8c79-4706-aaff-b246384695b8-kube-api-access-fqhsd\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc 
kubenswrapper[4585]: I0215 17:24:49.207448 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.208388 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.208912 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.210007 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.211813 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-svc\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.213913 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-config\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.269797 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhsd\" (UniqueName: \"kubernetes.io/projected/02233ef6-8c79-4706-aaff-b246384695b8-kube-api-access-fqhsd\") pod \"dnsmasq-dns-bccf8f775-45s2p\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.304432 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.637466 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.667817 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rrsg2"] Feb 15 17:24:49 crc kubenswrapper[4585]: W0215 17:24:49.669771 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77018408_f0fc_4655_904c_9090777a235e.slice/crio-b797e37c7ca27af5784eb2a4ab4b6947c7488cfc62130b2bcea7cc990ab7eb45 WatchSource:0}: Error finding container b797e37c7ca27af5784eb2a4ab4b6947c7488cfc62130b2bcea7cc990ab7eb45: Status 404 returned error can't find the container with id b797e37c7ca27af5784eb2a4ab4b6947c7488cfc62130b2bcea7cc990ab7eb45 Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.944315 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:24:49 crc kubenswrapper[4585]: W0215 17:24:49.948850 4585 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59e8bf4a_cb04_420e_bdfd_31ceabdccd36.slice/crio-0db62c71af9eab69cad0050ad0dd487cb2e9bdab86ee107402cb118be026eb8c WatchSource:0}: Error finding container 0db62c71af9eab69cad0050ad0dd487cb2e9bdab86ee107402cb118be026eb8c: Status 404 returned error can't find the container with id 0db62c71af9eab69cad0050ad0dd487cb2e9bdab86ee107402cb118be026eb8c Feb 15 17:24:49 crc kubenswrapper[4585]: I0215 17:24:49.961374 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.087104 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.177518 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fb7dd448-vc5x5" podUID="b1bd46e7-0703-49b5-81f2-516568284547" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.185:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.185:8443: connect: connection refused" Feb 15 17:24:50 crc kubenswrapper[4585]: W0215 17:24:50.233885 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02233ef6_8c79_4706_aaff_b246384695b8.slice/crio-9e653db19f3790b2775ba5c5e60d9f15240279ad90129bca62a58bf03c99c0c6 WatchSource:0}: Error finding container 9e653db19f3790b2775ba5c5e60d9f15240279ad90129bca62a58bf03c99c0c6: Status 404 returned error can't find the container with id 9e653db19f3790b2775ba5c5e60d9f15240279ad90129bca62a58bf03c99c0c6 Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.235492 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-45s2p"] Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.507000 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6322cc42-9f69-4ed5-a933-4951d5fd8849","Type":"ContainerStarted","Data":"b22cd95d8177628a2e5da7e08bea0b318fd491c8732cbae1a0a3021ea3738e5e"} Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.508442 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0","Type":"ContainerStarted","Data":"32b6b7453931f974b818518b3c2957bcca88f397db8009c29d3a558d695872ea"} Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.510103 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" event={"ID":"02233ef6-8c79-4706-aaff-b246384695b8","Type":"ContainerStarted","Data":"ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615"} Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.510127 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" event={"ID":"02233ef6-8c79-4706-aaff-b246384695b8","Type":"ContainerStarted","Data":"9e653db19f3790b2775ba5c5e60d9f15240279ad90129bca62a58bf03c99c0c6"} Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.514352 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rrsg2" event={"ID":"77018408-f0fc-4655-904c-9090777a235e","Type":"ContainerStarted","Data":"1713b69a41f76dfdc3517ef459910e24c774cfa58a29f1ef62d7bee5d631cafc"} Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.514382 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rrsg2" event={"ID":"77018408-f0fc-4655-904c-9090777a235e","Type":"ContainerStarted","Data":"b797e37c7ca27af5784eb2a4ab4b6947c7488cfc62130b2bcea7cc990ab7eb45"} Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.519480 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"59e8bf4a-cb04-420e-bdfd-31ceabdccd36","Type":"ContainerStarted","Data":"0db62c71af9eab69cad0050ad0dd487cb2e9bdab86ee107402cb118be026eb8c"} Feb 15 17:24:50 crc kubenswrapper[4585]: I0215 17:24:50.553877 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rrsg2" podStartSLOduration=2.5538547339999997 podStartE2EDuration="2.553854734s" podCreationTimestamp="2026-02-15 17:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:24:50.545966179 +0000 UTC m=+1146.489374301" watchObservedRunningTime="2026-02-15 17:24:50.553854734 +0000 UTC m=+1146.497262866" Feb 15 17:24:51 crc kubenswrapper[4585]: I0215 17:24:51.530672 4585 generic.go:334] "Generic (PLEG): container finished" podID="02233ef6-8c79-4706-aaff-b246384695b8" containerID="ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615" exitCode=0 Feb 15 17:24:51 crc kubenswrapper[4585]: I0215 17:24:51.532152 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" event={"ID":"02233ef6-8c79-4706-aaff-b246384695b8","Type":"ContainerDied","Data":"ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615"} Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.590718 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"59e8bf4a-cb04-420e-bdfd-31ceabdccd36","Type":"ContainerStarted","Data":"6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194"} Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.595719 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6322cc42-9f69-4ed5-a933-4951d5fd8849","Type":"ContainerStarted","Data":"1e1c3ef6718d642344c236b87790c8c82fef0ca68d9270140f09da041a999b96"} Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.595765 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"6322cc42-9f69-4ed5-a933-4951d5fd8849","Type":"ContainerStarted","Data":"d178ff6503e6e5ff6ea7eec75a4cbb7dfb0de4d4c55b4116304762823e7b4fa1"} Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.600301 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0","Type":"ContainerStarted","Data":"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09"} Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.600327 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0","Type":"ContainerStarted","Data":"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d"} Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.600421 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-log" containerID="cri-o://5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d" gracePeriod=30 Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.600466 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-metadata" containerID="cri-o://7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09" gracePeriod=30 Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.609721 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" event={"ID":"02233ef6-8c79-4706-aaff-b246384695b8","Type":"ContainerStarted","Data":"18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398"} Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.610322 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" 
Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.621381 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.176323776 podStartE2EDuration="5.621336318s" podCreationTimestamp="2026-02-15 17:24:48 +0000 UTC" firstStartedPulling="2026-02-15 17:24:49.955021833 +0000 UTC m=+1145.898429965" lastFinishedPulling="2026-02-15 17:24:52.400034365 +0000 UTC m=+1148.343442507" observedRunningTime="2026-02-15 17:24:53.608128729 +0000 UTC m=+1149.551536861" watchObservedRunningTime="2026-02-15 17:24:53.621336318 +0000 UTC m=+1149.564744450" Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.634905 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.902230249 podStartE2EDuration="5.634885207s" podCreationTimestamp="2026-02-15 17:24:48 +0000 UTC" firstStartedPulling="2026-02-15 17:24:49.669829994 +0000 UTC m=+1145.613238126" lastFinishedPulling="2026-02-15 17:24:52.402484942 +0000 UTC m=+1148.345893084" observedRunningTime="2026-02-15 17:24:53.627834555 +0000 UTC m=+1149.571242687" watchObservedRunningTime="2026-02-15 17:24:53.634885207 +0000 UTC m=+1149.578293339" Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.653403 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.334814262 podStartE2EDuration="5.653386789s" podCreationTimestamp="2026-02-15 17:24:48 +0000 UTC" firstStartedPulling="2026-02-15 17:24:50.081073388 +0000 UTC m=+1146.024481520" lastFinishedPulling="2026-02-15 17:24:52.399645905 +0000 UTC m=+1148.343054047" observedRunningTime="2026-02-15 17:24:53.644708623 +0000 UTC m=+1149.588116755" watchObservedRunningTime="2026-02-15 17:24:53.653386789 +0000 UTC m=+1149.596794921" Feb 15 17:24:53 crc kubenswrapper[4585]: I0215 17:24:53.670591 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-bccf8f775-45s2p" podStartSLOduration=5.670563325 podStartE2EDuration="5.670563325s" podCreationTimestamp="2026-02-15 17:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:24:53.664347877 +0000 UTC m=+1149.607756009" watchObservedRunningTime="2026-02-15 17:24:53.670563325 +0000 UTC m=+1149.613971457" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.170700 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.170948 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.194502 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.201296 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.294303 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5jq7\" (UniqueName: \"kubernetes.io/projected/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-kube-api-access-d5jq7\") pod \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.294363 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-combined-ca-bundle\") pod \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.294397 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-config-data\") pod \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.294417 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-logs\") pod \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\" (UID: \"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0\") " Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.295712 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-logs" (OuterVolumeSpecName: "logs") pod "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" (UID: "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.307655 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-kube-api-access-d5jq7" (OuterVolumeSpecName: "kube-api-access-d5jq7") pod "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" (UID: "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0"). InnerVolumeSpecName "kube-api-access-d5jq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.333674 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" (UID: "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.335067 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-config-data" (OuterVolumeSpecName: "config-data") pod "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" (UID: "6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.396182 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5jq7\" (UniqueName: \"kubernetes.io/projected/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-kube-api-access-d5jq7\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.396215 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.396223 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.396234 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.622102 4585 generic.go:334] "Generic (PLEG): container finished" podID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerID="7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09" exitCode=0 Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.622130 4585 generic.go:334] "Generic (PLEG): container finished" podID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" 
containerID="5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d" exitCode=143 Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.622213 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.622263 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0","Type":"ContainerDied","Data":"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09"} Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.622298 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0","Type":"ContainerDied","Data":"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d"} Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.622311 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0","Type":"ContainerDied","Data":"32b6b7453931f974b818518b3c2957bcca88f397db8009c29d3a558d695872ea"} Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.622332 4585 scope.go:117] "RemoveContainer" containerID="7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.665992 4585 scope.go:117] "RemoveContainer" containerID="5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.667584 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.689187 4585 scope.go:117] "RemoveContainer" containerID="7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.689285 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Feb 15 17:24:54 crc kubenswrapper[4585]: E0215 17:24:54.689646 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09\": container with ID starting with 7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09 not found: ID does not exist" containerID="7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.689676 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09"} err="failed to get container status \"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09\": rpc error: code = NotFound desc = could not find container \"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09\": container with ID starting with 7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09 not found: ID does not exist" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.689693 4585 scope.go:117] "RemoveContainer" containerID="5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d" Feb 15 17:24:54 crc kubenswrapper[4585]: E0215 17:24:54.690039 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d\": container with ID starting with 5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d not found: ID does not exist" containerID="5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.690068 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d"} err="failed to get container 
status \"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d\": rpc error: code = NotFound desc = could not find container \"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d\": container with ID starting with 5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d not found: ID does not exist" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.690083 4585 scope.go:117] "RemoveContainer" containerID="7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.693732 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09"} err="failed to get container status \"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09\": rpc error: code = NotFound desc = could not find container \"7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09\": container with ID starting with 7de00e6f0d358fa92ba4bb33c44327cd72861465748c578c5a7b0ee72687db09 not found: ID does not exist" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.693754 4585 scope.go:117] "RemoveContainer" containerID="5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.694565 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d"} err="failed to get container status \"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d\": rpc error: code = NotFound desc = could not find container \"5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d\": container with ID starting with 5cc38110b6be68fe442a7a97d292634e3ad4da90691f04f8efd207bfa0ea9a0d not found: ID does not exist" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.722646 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Feb 15 17:24:54 crc kubenswrapper[4585]: E0215 17:24:54.723154 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-log" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.723170 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-log" Feb 15 17:24:54 crc kubenswrapper[4585]: E0215 17:24:54.723193 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-metadata" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.723204 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-metadata" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.723448 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-log" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.723467 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" containerName="nova-metadata-metadata" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.724544 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.726982 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.727111 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.748592 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.804055 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rqw8\" (UniqueName: \"kubernetes.io/projected/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-kube-api-access-8rqw8\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.804120 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-config-data\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.804156 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.804296 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-logs\") pod \"nova-metadata-0\" 
(UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.804469 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.862338 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0" path="/var/lib/kubelet/pods/6d61f3fa-1b50-4a55-bf6a-81e0366bf7c0/volumes" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.906216 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rqw8\" (UniqueName: \"kubernetes.io/projected/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-kube-api-access-8rqw8\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.906291 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-config-data\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.906330 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.906366 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-logs\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.906431 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.911274 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-logs\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.918524 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.920269 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: I0215 17:24:54.928236 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-config-data\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:54 crc kubenswrapper[4585]: 
I0215 17:24:54.928501 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rqw8\" (UniqueName: \"kubernetes.io/projected/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-kube-api-access-8rqw8\") pod \"nova-metadata-0\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " pod="openstack/nova-metadata-0" Feb 15 17:24:55 crc kubenswrapper[4585]: I0215 17:24:55.048755 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:24:55 crc kubenswrapper[4585]: I0215 17:24:55.603335 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:24:55 crc kubenswrapper[4585]: I0215 17:24:55.635022 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a","Type":"ContainerStarted","Data":"2714cb1dc7eb3cf135786d9200f6c3af3f46b40cc7ab6e988f33b38ab894f7dc"} Feb 15 17:24:56 crc kubenswrapper[4585]: I0215 17:24:56.566829 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 15 17:24:56 crc kubenswrapper[4585]: I0215 17:24:56.651271 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a","Type":"ContainerStarted","Data":"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b"} Feb 15 17:24:56 crc kubenswrapper[4585]: I0215 17:24:56.651319 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a","Type":"ContainerStarted","Data":"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f"} Feb 15 17:24:56 crc kubenswrapper[4585]: I0215 17:24:56.686082 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.686059858 podStartE2EDuration="2.686059858s" podCreationTimestamp="2026-02-15 17:24:54 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:24:56.679705416 +0000 UTC m=+1152.623113558" watchObservedRunningTime="2026-02-15 17:24:56.686059858 +0000 UTC m=+1152.629468000" Feb 15 17:24:58 crc kubenswrapper[4585]: I0215 17:24:58.674733 4585 generic.go:334] "Generic (PLEG): container finished" podID="77018408-f0fc-4655-904c-9090777a235e" containerID="1713b69a41f76dfdc3517ef459910e24c774cfa58a29f1ef62d7bee5d631cafc" exitCode=0 Feb 15 17:24:58 crc kubenswrapper[4585]: I0215 17:24:58.674790 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rrsg2" event={"ID":"77018408-f0fc-4655-904c-9090777a235e","Type":"ContainerDied","Data":"1713b69a41f76dfdc3517ef459910e24c774cfa58a29f1ef62d7bee5d631cafc"} Feb 15 17:24:58 crc kubenswrapper[4585]: I0215 17:24:58.901266 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 15 17:24:58 crc kubenswrapper[4585]: I0215 17:24:58.901303 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.202616 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.246444 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.306797 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.379366 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7gtqn"] Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.379588 4585 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" podUID="23c49d13-f495-4583-8132-00a2af47b3ef" containerName="dnsmasq-dns" containerID="cri-o://f2b274b38fc6ee93651a962e16084929d27cf97caf323f160d1e8c32684ad320" gracePeriod=10 Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.705014 4585 generic.go:334] "Generic (PLEG): container finished" podID="23c49d13-f495-4583-8132-00a2af47b3ef" containerID="f2b274b38fc6ee93651a962e16084929d27cf97caf323f160d1e8c32684ad320" exitCode=0 Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.705959 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" event={"ID":"23c49d13-f495-4583-8132-00a2af47b3ef","Type":"ContainerDied","Data":"f2b274b38fc6ee93651a962e16084929d27cf97caf323f160d1e8c32684ad320"} Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.822812 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.887904 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.224:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.932506 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.224:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:24:59 crc kubenswrapper[4585]: I0215 17:24:59.982631 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.049268 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.050566 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.136409 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-sb\") pod \"23c49d13-f495-4583-8132-00a2af47b3ef\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.136441 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-nb\") pod \"23c49d13-f495-4583-8132-00a2af47b3ef\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.136487 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wblvs\" (UniqueName: \"kubernetes.io/projected/23c49d13-f495-4583-8132-00a2af47b3ef-kube-api-access-wblvs\") pod \"23c49d13-f495-4583-8132-00a2af47b3ef\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.136545 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-config\") pod \"23c49d13-f495-4583-8132-00a2af47b3ef\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.136562 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-svc\") pod \"23c49d13-f495-4583-8132-00a2af47b3ef\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.136631 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-swift-storage-0\") pod \"23c49d13-f495-4583-8132-00a2af47b3ef\" (UID: \"23c49d13-f495-4583-8132-00a2af47b3ef\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.159758 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c49d13-f495-4583-8132-00a2af47b3ef-kube-api-access-wblvs" (OuterVolumeSpecName: "kube-api-access-wblvs") pod "23c49d13-f495-4583-8132-00a2af47b3ef" (UID: "23c49d13-f495-4583-8132-00a2af47b3ef"). InnerVolumeSpecName "kube-api-access-wblvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.173125 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.211206 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "23c49d13-f495-4583-8132-00a2af47b3ef" (UID: "23c49d13-f495-4583-8132-00a2af47b3ef"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.217140 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-config" (OuterVolumeSpecName: "config") pod "23c49d13-f495-4583-8132-00a2af47b3ef" (UID: "23c49d13-f495-4583-8132-00a2af47b3ef"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.236109 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "23c49d13-f495-4583-8132-00a2af47b3ef" (UID: "23c49d13-f495-4583-8132-00a2af47b3ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.239158 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "23c49d13-f495-4583-8132-00a2af47b3ef" (UID: "23c49d13-f495-4583-8132-00a2af47b3ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.239510 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.239536 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.239545 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wblvs\" (UniqueName: \"kubernetes.io/projected/23c49d13-f495-4583-8132-00a2af47b3ef-kube-api-access-wblvs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.239557 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.239567 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.244703 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23c49d13-f495-4583-8132-00a2af47b3ef" (UID: "23c49d13-f495-4583-8132-00a2af47b3ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.340349 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25sdq\" (UniqueName: \"kubernetes.io/projected/77018408-f0fc-4655-904c-9090777a235e-kube-api-access-25sdq\") pod \"77018408-f0fc-4655-904c-9090777a235e\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.340544 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-config-data\") pod \"77018408-f0fc-4655-904c-9090777a235e\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.340571 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-scripts\") pod \"77018408-f0fc-4655-904c-9090777a235e\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.340665 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-combined-ca-bundle\") pod \"77018408-f0fc-4655-904c-9090777a235e\" (UID: \"77018408-f0fc-4655-904c-9090777a235e\") " Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.341395 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23c49d13-f495-4583-8132-00a2af47b3ef-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.354026 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77018408-f0fc-4655-904c-9090777a235e-kube-api-access-25sdq" (OuterVolumeSpecName: "kube-api-access-25sdq") pod "77018408-f0fc-4655-904c-9090777a235e" (UID: "77018408-f0fc-4655-904c-9090777a235e"). InnerVolumeSpecName "kube-api-access-25sdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.357730 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-scripts" (OuterVolumeSpecName: "scripts") pod "77018408-f0fc-4655-904c-9090777a235e" (UID: "77018408-f0fc-4655-904c-9090777a235e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.373866 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77018408-f0fc-4655-904c-9090777a235e" (UID: "77018408-f0fc-4655-904c-9090777a235e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.385448 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-config-data" (OuterVolumeSpecName: "config-data") pod "77018408-f0fc-4655-904c-9090777a235e" (UID: "77018408-f0fc-4655-904c-9090777a235e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.443167 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.443369 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.443428 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77018408-f0fc-4655-904c-9090777a235e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.443483 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25sdq\" (UniqueName: \"kubernetes.io/projected/77018408-f0fc-4655-904c-9090777a235e-kube-api-access-25sdq\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.715882 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" event={"ID":"23c49d13-f495-4583-8132-00a2af47b3ef","Type":"ContainerDied","Data":"d2353f4c1499bfc03de18154f70ecf739efca16402336a7f574970354298626a"} Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.716211 4585 scope.go:117] "RemoveContainer" 
containerID="f2b274b38fc6ee93651a962e16084929d27cf97caf323f160d1e8c32684ad320" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.715905 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7gtqn" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.718578 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rrsg2" event={"ID":"77018408-f0fc-4655-904c-9090777a235e","Type":"ContainerDied","Data":"b797e37c7ca27af5784eb2a4ab4b6947c7488cfc62130b2bcea7cc990ab7eb45"} Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.718632 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b797e37c7ca27af5784eb2a4ab4b6947c7488cfc62130b2bcea7cc990ab7eb45" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.718891 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rrsg2" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.753032 4585 scope.go:117] "RemoveContainer" containerID="86eefa8cacc2f6c04f1346bdaeab6d4bff6962a3749ddccd09270a14cbe92abf" Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.789682 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7gtqn"] Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.797520 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7gtqn"] Feb 15 17:25:00 crc kubenswrapper[4585]: I0215 17:25:00.853191 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c49d13-f495-4583-8132-00a2af47b3ef" path="/var/lib/kubelet/pods/23c49d13-f495-4583-8132-00a2af47b3ef/volumes" Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.005464 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.005693 4585 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-api-0" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-log" containerID="cri-o://d178ff6503e6e5ff6ea7eec75a4cbb7dfb0de4d4c55b4116304762823e7b4fa1" gracePeriod=30 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.005937 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-api" containerID="cri-o://1e1c3ef6718d642344c236b87790c8c82fef0ca68d9270140f09da041a999b96" gracePeriod=30 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.027535 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.080203 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.080476 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a02ef963-4b80-427c-9f23-c033a729b944" containerName="kube-state-metrics" containerID="cri-o://b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd" gracePeriod=30 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.112574 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.634667 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.760483 4585 generic.go:334] "Generic (PLEG): container finished" podID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerID="d178ff6503e6e5ff6ea7eec75a4cbb7dfb0de4d4c55b4116304762823e7b4fa1" exitCode=143 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.760728 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6322cc42-9f69-4ed5-a933-4951d5fd8849","Type":"ContainerDied","Data":"d178ff6503e6e5ff6ea7eec75a4cbb7dfb0de4d4c55b4116304762823e7b4fa1"} Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762079 4585 generic.go:334] "Generic (PLEG): container finished" podID="a02ef963-4b80-427c-9f23-c033a729b944" containerID="b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd" exitCode=2 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762187 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762250 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="59e8bf4a-cb04-420e-bdfd-31ceabdccd36" containerName="nova-scheduler-scheduler" containerID="cri-o://6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" gracePeriod=30 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762327 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a02ef963-4b80-427c-9f23-c033a729b944","Type":"ContainerDied","Data":"b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd"} Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762353 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"a02ef963-4b80-427c-9f23-c033a729b944","Type":"ContainerDied","Data":"36023e5f528efdbbcbb2fb52bb7be7343c5672eb1006890e5a2819332e385de1"} Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762368 4585 scope.go:117] "RemoveContainer" containerID="b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd" Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762525 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-log" containerID="cri-o://7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f" gracePeriod=30 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.762890 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-metadata" containerID="cri-o://53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b" gracePeriod=30 Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.776364 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddwsn\" (UniqueName: \"kubernetes.io/projected/a02ef963-4b80-427c-9f23-c033a729b944-kube-api-access-ddwsn\") pod \"a02ef963-4b80-427c-9f23-c033a729b944\" (UID: \"a02ef963-4b80-427c-9f23-c033a729b944\") " Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.794838 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02ef963-4b80-427c-9f23-c033a729b944-kube-api-access-ddwsn" (OuterVolumeSpecName: "kube-api-access-ddwsn") pod "a02ef963-4b80-427c-9f23-c033a729b944" (UID: "a02ef963-4b80-427c-9f23-c033a729b944"). InnerVolumeSpecName "kube-api-access-ddwsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.851135 4585 scope.go:117] "RemoveContainer" containerID="b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd" Feb 15 17:25:01 crc kubenswrapper[4585]: E0215 17:25:01.851894 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd\": container with ID starting with b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd not found: ID does not exist" containerID="b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd" Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.851928 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd"} err="failed to get container status \"b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd\": rpc error: code = NotFound desc = could not find container \"b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd\": container with ID starting with b0fa641efba430d98ff39b8ab359dc8e902d024161363d4f10febbe38b6396cd not found: ID does not exist" Feb 15 17:25:01 crc kubenswrapper[4585]: I0215 17:25:01.879605 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddwsn\" (UniqueName: \"kubernetes.io/projected/a02ef963-4b80-427c-9f23-c033a729b944-kube-api-access-ddwsn\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.111646 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.128841 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.153311 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/kube-state-metrics-0"] Feb 15 17:25:02 crc kubenswrapper[4585]: E0215 17:25:02.153812 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02ef963-4b80-427c-9f23-c033a729b944" containerName="kube-state-metrics" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.153825 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02ef963-4b80-427c-9f23-c033a729b944" containerName="kube-state-metrics" Feb 15 17:25:02 crc kubenswrapper[4585]: E0215 17:25:02.153862 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c49d13-f495-4583-8132-00a2af47b3ef" containerName="dnsmasq-dns" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.153870 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c49d13-f495-4583-8132-00a2af47b3ef" containerName="dnsmasq-dns" Feb 15 17:25:02 crc kubenswrapper[4585]: E0215 17:25:02.153893 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77018408-f0fc-4655-904c-9090777a235e" containerName="nova-manage" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.153899 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="77018408-f0fc-4655-904c-9090777a235e" containerName="nova-manage" Feb 15 17:25:02 crc kubenswrapper[4585]: E0215 17:25:02.153925 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c49d13-f495-4583-8132-00a2af47b3ef" containerName="init" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.153931 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c49d13-f495-4583-8132-00a2af47b3ef" containerName="init" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.154139 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="77018408-f0fc-4655-904c-9090777a235e" containerName="nova-manage" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.154156 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02ef963-4b80-427c-9f23-c033a729b944" 
containerName="kube-state-metrics" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.154171 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c49d13-f495-4583-8132-00a2af47b3ef" containerName="dnsmasq-dns" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.159898 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.166389 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.178013 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.181117 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.290099 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.290177 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.290221 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhzmq\" (UniqueName: 
\"kubernetes.io/projected/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-api-access-bhzmq\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.290277 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.309943 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.391336 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-logs\") pod \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.391411 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-combined-ca-bundle\") pod \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.391475 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rqw8\" (UniqueName: \"kubernetes.io/projected/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-kube-api-access-8rqw8\") pod \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.391510 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-config-data\") pod \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.391697 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-nova-metadata-tls-certs\") pod \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\" (UID: \"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a\") " Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.391998 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.392067 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.392133 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhzmq\" (UniqueName: \"kubernetes.io/projected/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-api-access-bhzmq\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.392208 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.392699 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-logs" (OuterVolumeSpecName: "logs") pod "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" (UID: "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.400786 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.401119 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-kube-api-access-8rqw8" (OuterVolumeSpecName: "kube-api-access-8rqw8") pod "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" (UID: "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a"). InnerVolumeSpecName "kube-api-access-8rqw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.401920 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.428968 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.430913 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhzmq\" (UniqueName: \"kubernetes.io/projected/23bad4a7-c77d-4720-92df-7d126a0f079c-kube-api-access-bhzmq\") pod \"kube-state-metrics-0\" (UID: \"23bad4a7-c77d-4720-92df-7d126a0f079c\") " pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.435513 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" (UID: "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.442918 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-config-data" (OuterVolumeSpecName: "config-data") pod "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" (UID: "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.481084 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" (UID: "bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.496289 4585 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.496348 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.496361 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.496370 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rqw8\" (UniqueName: \"kubernetes.io/projected/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-kube-api-access-8rqw8\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.496378 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.509733 4585 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.771911 4585 generic.go:334] "Generic (PLEG): container finished" podID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerID="53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b" exitCode=0 Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.771941 4585 generic.go:334] "Generic (PLEG): container finished" podID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerID="7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f" exitCode=143 Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.771988 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a","Type":"ContainerDied","Data":"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b"} Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.772048 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a","Type":"ContainerDied","Data":"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f"} Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.772061 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a","Type":"ContainerDied","Data":"2714cb1dc7eb3cf135786d9200f6c3af3f46b40cc7ab6e988f33b38ab894f7dc"} Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.772079 4585 scope.go:117] "RemoveContainer" containerID="53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.772265 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.812445 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.815889 4585 scope.go:117] "RemoveContainer" containerID="7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.827166 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.853363 4585 scope.go:117] "RemoveContainer" containerID="53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b" Feb 15 17:25:02 crc kubenswrapper[4585]: E0215 17:25:02.860697 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b\": container with ID starting with 53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b not found: ID does not exist" containerID="53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.860741 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b"} err="failed to get container status \"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b\": rpc error: code = NotFound desc = could not find container \"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b\": container with ID starting with 53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b not found: ID does not exist" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.860764 4585 scope.go:117] "RemoveContainer" containerID="7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f" Feb 15 17:25:02 crc kubenswrapper[4585]: 
E0215 17:25:02.864713 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f\": container with ID starting with 7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f not found: ID does not exist" containerID="7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.864739 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f"} err="failed to get container status \"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f\": rpc error: code = NotFound desc = could not find container \"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f\": container with ID starting with 7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f not found: ID does not exist" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.864753 4585 scope.go:117] "RemoveContainer" containerID="53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.867084 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a02ef963-4b80-427c-9f23-c033a729b944" path="/var/lib/kubelet/pods/a02ef963-4b80-427c-9f23-c033a729b944/volumes" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.867629 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" path="/var/lib/kubelet/pods/bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a/volumes" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.868222 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:25:02 crc kubenswrapper[4585]: E0215 17:25:02.868558 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-log" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.868574 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-log" Feb 15 17:25:02 crc kubenswrapper[4585]: E0215 17:25:02.869935 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-metadata" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.869950 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-metadata" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.870168 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-log" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.870205 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7a7924-01ea-4d9c-ba7d-0d19dd78d43a" containerName="nova-metadata-metadata" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.871486 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.873142 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b"} err="failed to get container status \"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b\": rpc error: code = NotFound desc = could not find container \"53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b\": container with ID starting with 53d76b5fc5a996af50446c7420dd6f3b084b8173f0dc2f234da2fe34f0f29a6b not found: ID does not exist" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.873161 4585 scope.go:117] "RemoveContainer" containerID="7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.873720 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f"} err="failed to get container status \"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f\": rpc error: code = NotFound desc = could not find container \"7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f\": container with ID starting with 7c4a71b8883c1b62d82fb1f8266f7c013dc498d8972e2465e1ecb5b8d5d1b04f not found: ID does not exist" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.877098 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.877332 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 15 17:25:02 crc kubenswrapper[4585]: I0215 17:25:02.877704 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.017825 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-config-data\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.018163 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-logs\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.018255 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.018316 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.018373 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj49j\" (UniqueName: \"kubernetes.io/projected/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-kube-api-access-nj49j\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.119790 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.119888 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.119952 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj49j\" (UniqueName: \"kubernetes.io/projected/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-kube-api-access-nj49j\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.120011 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-config-data\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.120045 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-logs\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.120868 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-logs\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " 
pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.124897 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.125375 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-config-data\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.126589 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.160392 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj49j\" (UniqueName: \"kubernetes.io/projected/f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3-kube-api-access-nj49j\") pod \"nova-metadata-0\" (UID: \"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3\") " pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.225661 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.304085 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.372909 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 15 17:25:03 crc kubenswrapper[4585]: W0215 17:25:03.377722 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23bad4a7_c77d_4720_92df_7d126a0f079c.slice/crio-4cf49b14ba53d8a9f78a70d0251b182802dc3478aa00772d3098253deffdbc59 WatchSource:0}: Error finding container 4cf49b14ba53d8a9f78a70d0251b182802dc3478aa00772d3098253deffdbc59: Status 404 returned error can't find the container with id 4cf49b14ba53d8a9f78a70d0251b182802dc3478aa00772d3098253deffdbc59 Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.411047 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.705339 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.787309 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3","Type":"ContainerStarted","Data":"1e4eb82240e62bdcc443dfcfa2b3244d961e9566cda1f4c1af2616fab859582b"} Feb 15 17:25:03 crc kubenswrapper[4585]: I0215 17:25:03.790380 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"23bad4a7-c77d-4720-92df-7d126a0f079c","Type":"ContainerStarted","Data":"4cf49b14ba53d8a9f78a70d0251b182802dc3478aa00772d3098253deffdbc59"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.154650 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.155354 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-central-agent" containerID="cri-o://2ffd749467ec493fbe19ebd3cfde8329fc2182bf003da1bdad9d6bc7de5f2ae1" gracePeriod=30 Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.155846 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="proxy-httpd" containerID="cri-o://1cdb88e45f977f5af4235e3d2f1482a82f1790c754743c80f0293f9eb4e4f95f" gracePeriod=30 Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.155967 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-notification-agent" containerID="cri-o://2c159d4e934b99bb7cc26fd49aa67674f8490744043a941f631410a12f8581b9" gracePeriod=30 Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.156068 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="sg-core" containerID="cri-o://5bb9619bd93dbbdf1cd8c46a3bf6ad909a7d218dafc12b447182c1e4103cf556" gracePeriod=30 Feb 15 17:25:04 crc kubenswrapper[4585]: E0215 17:25:04.203080 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 15 17:25:04 crc kubenswrapper[4585]: E0215 17:25:04.208081 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 15 17:25:04 crc kubenswrapper[4585]: E0215 17:25:04.209315 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 15 17:25:04 crc kubenswrapper[4585]: E0215 17:25:04.209347 4585 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="59e8bf4a-cb04-420e-bdfd-31ceabdccd36" containerName="nova-scheduler-scheduler" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.680797 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.809665 4585 generic.go:334] "Generic (PLEG): container finished" podID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerID="1cdb88e45f977f5af4235e3d2f1482a82f1790c754743c80f0293f9eb4e4f95f" exitCode=0 Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.809707 4585 generic.go:334] "Generic (PLEG): container finished" podID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerID="5bb9619bd93dbbdf1cd8c46a3bf6ad909a7d218dafc12b447182c1e4103cf556" exitCode=2 Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.809719 4585 generic.go:334] "Generic (PLEG): container finished" podID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerID="2ffd749467ec493fbe19ebd3cfde8329fc2182bf003da1bdad9d6bc7de5f2ae1" exitCode=0 Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.809771 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerDied","Data":"1cdb88e45f977f5af4235e3d2f1482a82f1790c754743c80f0293f9eb4e4f95f"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.809804 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerDied","Data":"5bb9619bd93dbbdf1cd8c46a3bf6ad909a7d218dafc12b447182c1e4103cf556"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.809817 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerDied","Data":"2ffd749467ec493fbe19ebd3cfde8329fc2182bf003da1bdad9d6bc7de5f2ae1"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.813530 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3","Type":"ContainerStarted","Data":"b102f8345da2fe2a2a800e8c885af7d30695912f2f766412c62160e4275a000d"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.813572 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3","Type":"ContainerStarted","Data":"f64a290759533ba7122bd858d501a77cad27769147a578580d1fca04e44f77ed"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.816097 4585 generic.go:334] "Generic (PLEG): container finished" podID="59e8bf4a-cb04-420e-bdfd-31ceabdccd36" containerID="6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" exitCode=0 Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.816178 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"59e8bf4a-cb04-420e-bdfd-31ceabdccd36","Type":"ContainerDied","Data":"6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.816196 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"59e8bf4a-cb04-420e-bdfd-31ceabdccd36","Type":"ContainerDied","Data":"0db62c71af9eab69cad0050ad0dd487cb2e9bdab86ee107402cb118be026eb8c"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.816212 4585 scope.go:117] "RemoveContainer" containerID="6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.816347 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.819054 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"23bad4a7-c77d-4720-92df-7d126a0f079c","Type":"ContainerStarted","Data":"a86db61d8dfd4b46cf9aeb6d4f859c92b7ea461085619686b882e09d6dd4cff3"} Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.819671 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.839679 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.839658763 podStartE2EDuration="2.839658763s" podCreationTimestamp="2026-02-15 17:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:25:04.836287642 +0000 UTC m=+1160.779695774" watchObservedRunningTime="2026-02-15 17:25:04.839658763 +0000 UTC m=+1160.783066895" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.855779 4585 scope.go:117] "RemoveContainer" containerID="6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" Feb 15 17:25:04 crc kubenswrapper[4585]: E0215 17:25:04.863994 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194\": container with ID starting with 6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194 not found: ID does not exist" containerID="6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.864042 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194"} err="failed to get container 
status \"6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194\": rpc error: code = NotFound desc = could not find container \"6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194\": container with ID starting with 6724b8e75472a38e4335fb532aefe6e4ca41471f405c1d57ccdec55476e35194 not found: ID does not exist" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.864100 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-config-data\") pod \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.864353 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf2jv\" (UniqueName: \"kubernetes.io/projected/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-kube-api-access-kf2jv\") pod \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.864451 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-combined-ca-bundle\") pod \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\" (UID: \"59e8bf4a-cb04-420e-bdfd-31ceabdccd36\") " Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.865415 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.190357402 podStartE2EDuration="2.864776276s" podCreationTimestamp="2026-02-15 17:25:02 +0000 UTC" firstStartedPulling="2026-02-15 17:25:03.410838052 +0000 UTC m=+1159.354246184" lastFinishedPulling="2026-02-15 17:25:04.085256926 +0000 UTC m=+1160.028665058" observedRunningTime="2026-02-15 17:25:04.863403369 +0000 UTC m=+1160.806811501" watchObservedRunningTime="2026-02-15 17:25:04.864776276 +0000 
UTC m=+1160.808184408" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.873137 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-kube-api-access-kf2jv" (OuterVolumeSpecName: "kube-api-access-kf2jv") pod "59e8bf4a-cb04-420e-bdfd-31ceabdccd36" (UID: "59e8bf4a-cb04-420e-bdfd-31ceabdccd36"). InnerVolumeSpecName "kube-api-access-kf2jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.907117 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-config-data" (OuterVolumeSpecName: "config-data") pod "59e8bf4a-cb04-420e-bdfd-31ceabdccd36" (UID: "59e8bf4a-cb04-420e-bdfd-31ceabdccd36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.912945 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59e8bf4a-cb04-420e-bdfd-31ceabdccd36" (UID: "59e8bf4a-cb04-420e-bdfd-31ceabdccd36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.966996 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf2jv\" (UniqueName: \"kubernetes.io/projected/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-kube-api-access-kf2jv\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.967023 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:04 crc kubenswrapper[4585]: I0215 17:25:04.967032 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59e8bf4a-cb04-420e-bdfd-31ceabdccd36-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.183032 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.194034 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.207456 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:25:05 crc kubenswrapper[4585]: E0215 17:25:05.208194 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e8bf4a-cb04-420e-bdfd-31ceabdccd36" containerName="nova-scheduler-scheduler" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.208212 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e8bf4a-cb04-420e-bdfd-31ceabdccd36" containerName="nova-scheduler-scheduler" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.208431 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e8bf4a-cb04-420e-bdfd-31ceabdccd36" containerName="nova-scheduler-scheduler" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 
17:25:05.209249 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.211273 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.224678 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.383002 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4170aae6-85c6-408f-a00b-2e3869fae11e-config-data\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.383367 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4170aae6-85c6-408f-a00b-2e3869fae11e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.383501 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smjws\" (UniqueName: \"kubernetes.io/projected/4170aae6-85c6-408f-a00b-2e3869fae11e-kube-api-access-smjws\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.442235 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5fb7dd448-vc5x5" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.486783 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4170aae6-85c6-408f-a00b-2e3869fae11e-config-data\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.486878 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4170aae6-85c6-408f-a00b-2e3869fae11e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.487001 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smjws\" (UniqueName: \"kubernetes.io/projected/4170aae6-85c6-408f-a00b-2e3869fae11e-kube-api-access-smjws\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.492233 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4170aae6-85c6-408f-a00b-2e3869fae11e-config-data\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.492644 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4170aae6-85c6-408f-a00b-2e3869fae11e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.513117 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjws\" (UniqueName: \"kubernetes.io/projected/4170aae6-85c6-408f-a00b-2e3869fae11e-kube-api-access-smjws\") pod \"nova-scheduler-0\" (UID: \"4170aae6-85c6-408f-a00b-2e3869fae11e\") " 
pod="openstack/nova-scheduler-0" Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.529463 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b9f5444b-8n6qh"] Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.529739 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon-log" containerID="cri-o://67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3" gracePeriod=30 Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.529871 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" containerID="cri-o://7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f" gracePeriod=30 Feb 15 17:25:05 crc kubenswrapper[4585]: I0215 17:25:05.530235 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.057498 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 15 17:25:06 crc kubenswrapper[4585]: W0215 17:25:06.059514 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4170aae6_85c6_408f_a00b_2e3869fae11e.slice/crio-b4c65602e978591629f696235bec1f0f66eb57d496e1c6e74efd796991d06192 WatchSource:0}: Error finding container b4c65602e978591629f696235bec1f0f66eb57d496e1c6e74efd796991d06192: Status 404 returned error can't find the container with id b4c65602e978591629f696235bec1f0f66eb57d496e1c6e74efd796991d06192 Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.860937 4585 generic.go:334] "Generic (PLEG): container finished" podID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerID="1e1c3ef6718d642344c236b87790c8c82fef0ca68d9270140f09da041a999b96" exitCode=0 Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.872306 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e8bf4a-cb04-420e-bdfd-31ceabdccd36" path="/var/lib/kubelet/pods/59e8bf4a-cb04-420e-bdfd-31ceabdccd36/volumes" Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.875514 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6322cc42-9f69-4ed5-a933-4951d5fd8849","Type":"ContainerDied","Data":"1e1c3ef6718d642344c236b87790c8c82fef0ca68d9270140f09da041a999b96"} Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.875562 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4170aae6-85c6-408f-a00b-2e3869fae11e","Type":"ContainerStarted","Data":"7fc07e27ebe201fe3f2c2eb3465469a67c8b7939cf0051a92abdeae78d247d28"} Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.875575 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"4170aae6-85c6-408f-a00b-2e3869fae11e","Type":"ContainerStarted","Data":"b4c65602e978591629f696235bec1f0f66eb57d496e1c6e74efd796991d06192"} Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.917195 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.917172831 podStartE2EDuration="1.917172831s" podCreationTimestamp="2026-02-15 17:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:25:06.882644523 +0000 UTC m=+1162.826052665" watchObservedRunningTime="2026-02-15 17:25:06.917172831 +0000 UTC m=+1162.860580963" Feb 15 17:25:06 crc kubenswrapper[4585]: I0215 17:25:06.986297 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.117483 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-config-data\") pod \"6322cc42-9f69-4ed5-a933-4951d5fd8849\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.117532 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-combined-ca-bundle\") pod \"6322cc42-9f69-4ed5-a933-4951d5fd8849\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.117590 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6322cc42-9f69-4ed5-a933-4951d5fd8849-logs\") pod \"6322cc42-9f69-4ed5-a933-4951d5fd8849\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 
17:25:07.117638 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gw9k\" (UniqueName: \"kubernetes.io/projected/6322cc42-9f69-4ed5-a933-4951d5fd8849-kube-api-access-4gw9k\") pod \"6322cc42-9f69-4ed5-a933-4951d5fd8849\" (UID: \"6322cc42-9f69-4ed5-a933-4951d5fd8849\") " Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.119022 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6322cc42-9f69-4ed5-a933-4951d5fd8849-logs" (OuterVolumeSpecName: "logs") pod "6322cc42-9f69-4ed5-a933-4951d5fd8849" (UID: "6322cc42-9f69-4ed5-a933-4951d5fd8849"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.137835 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6322cc42-9f69-4ed5-a933-4951d5fd8849-kube-api-access-4gw9k" (OuterVolumeSpecName: "kube-api-access-4gw9k") pod "6322cc42-9f69-4ed5-a933-4951d5fd8849" (UID: "6322cc42-9f69-4ed5-a933-4951d5fd8849"). InnerVolumeSpecName "kube-api-access-4gw9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.148798 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6322cc42-9f69-4ed5-a933-4951d5fd8849" (UID: "6322cc42-9f69-4ed5-a933-4951d5fd8849"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.150659 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-config-data" (OuterVolumeSpecName: "config-data") pod "6322cc42-9f69-4ed5-a933-4951d5fd8849" (UID: "6322cc42-9f69-4ed5-a933-4951d5fd8849"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.219979 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.220013 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6322cc42-9f69-4ed5-a933-4951d5fd8849-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.220023 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gw9k\" (UniqueName: \"kubernetes.io/projected/6322cc42-9f69-4ed5-a933-4951d5fd8849-kube-api-access-4gw9k\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.220034 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6322cc42-9f69-4ed5-a933-4951d5fd8849-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.888005 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.889009 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6322cc42-9f69-4ed5-a933-4951d5fd8849","Type":"ContainerDied","Data":"b22cd95d8177628a2e5da7e08bea0b318fd491c8732cbae1a0a3021ea3738e5e"} Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.889060 4585 scope.go:117] "RemoveContainer" containerID="1e1c3ef6718d642344c236b87790c8c82fef0ca68d9270140f09da041a999b96" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.928941 4585 scope.go:117] "RemoveContainer" containerID="d178ff6503e6e5ff6ea7eec75a4cbb7dfb0de4d4c55b4116304762823e7b4fa1" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.933894 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.943565 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.965530 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:07 crc kubenswrapper[4585]: E0215 17:25:07.966064 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-log" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.966083 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-log" Feb 15 17:25:07 crc kubenswrapper[4585]: E0215 17:25:07.966114 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-api" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.966123 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-api" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.966392 4585 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-log" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.966410 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" containerName="nova-api-api" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.967886 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.971688 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 15 17:25:07 crc kubenswrapper[4585]: I0215 17:25:07.990377 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.037146 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.037360 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-config-data\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.037411 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef491757-9152-43ed-81fd-86baccc15079-logs\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.037636 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24r8k\" (UniqueName: \"kubernetes.io/projected/ef491757-9152-43ed-81fd-86baccc15079-kube-api-access-24r8k\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.140114 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.140218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-config-data\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.140235 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef491757-9152-43ed-81fd-86baccc15079-logs\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.140278 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24r8k\" (UniqueName: \"kubernetes.io/projected/ef491757-9152-43ed-81fd-86baccc15079-kube-api-access-24r8k\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.140817 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef491757-9152-43ed-81fd-86baccc15079-logs\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " 
pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.149747 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.152125 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-config-data\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.162013 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24r8k\" (UniqueName: \"kubernetes.io/projected/ef491757-9152-43ed-81fd-86baccc15079-kube-api-access-24r8k\") pod \"nova-api-0\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.225836 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.226046 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.284225 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.782388 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.855114 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6322cc42-9f69-4ed5-a933-4951d5fd8849" path="/var/lib/kubelet/pods/6322cc42-9f69-4ed5-a933-4951d5fd8849/volumes" Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.899155 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef491757-9152-43ed-81fd-86baccc15079","Type":"ContainerStarted","Data":"1bcdd64095cccacf9a283650c899982878649e075299bf314c9f4e437142e728"} Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.903132 4585 generic.go:334] "Generic (PLEG): container finished" podID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerID="7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f" exitCode=0 Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.903278 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerDied","Data":"7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f"} Feb 15 17:25:08 crc kubenswrapper[4585]: I0215 17:25:08.903322 4585 scope.go:117] "RemoveContainer" containerID="e6a81c4c256ad7683acf59828ce484c3e06e042226f140242575aec7b2779784" Feb 15 17:25:09 crc kubenswrapper[4585]: I0215 17:25:09.922036 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef491757-9152-43ed-81fd-86baccc15079","Type":"ContainerStarted","Data":"5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48"} Feb 15 17:25:09 crc kubenswrapper[4585]: I0215 17:25:09.922707 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ef491757-9152-43ed-81fd-86baccc15079","Type":"ContainerStarted","Data":"3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0"} Feb 15 17:25:09 crc kubenswrapper[4585]: I0215 17:25:09.932803 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection refused" Feb 15 17:25:10 crc kubenswrapper[4585]: I0215 17:25:10.534922 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 15 17:25:10 crc kubenswrapper[4585]: I0215 17:25:10.943149 4585 generic.go:334] "Generic (PLEG): container finished" podID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerID="2c159d4e934b99bb7cc26fd49aa67674f8490744043a941f631410a12f8581b9" exitCode=0 Feb 15 17:25:10 crc kubenswrapper[4585]: I0215 17:25:10.944232 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerDied","Data":"2c159d4e934b99bb7cc26fd49aa67674f8490744043a941f631410a12f8581b9"} Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.191139 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.233211 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.233194548 podStartE2EDuration="4.233194548s" podCreationTimestamp="2026-02-15 17:25:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:25:09.963152561 +0000 UTC m=+1165.906560703" watchObservedRunningTime="2026-02-15 17:25:11.233194548 +0000 UTC m=+1167.176602680" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.309617 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-run-httpd\") pod \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.309674 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc2f6\" (UniqueName: \"kubernetes.io/projected/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-kube-api-access-wc2f6\") pod \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.309717 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-config-data\") pod \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.309751 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-log-httpd\") pod \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\" (UID: 
\"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.309838 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-scripts\") pod \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.309871 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-combined-ca-bundle\") pod \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.309936 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-sg-core-conf-yaml\") pod \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\" (UID: \"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66\") " Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.318734 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" (UID: "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.319667 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" (UID: "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.323942 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-scripts" (OuterVolumeSpecName: "scripts") pod "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" (UID: "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.328145 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-kube-api-access-wc2f6" (OuterVolumeSpecName: "kube-api-access-wc2f6") pod "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" (UID: "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66"). InnerVolumeSpecName "kube-api-access-wc2f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.371723 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" (UID: "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.412655 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.412692 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.412703 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.412712 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc2f6\" (UniqueName: \"kubernetes.io/projected/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-kube-api-access-wc2f6\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.412721 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.425484 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" (UID: "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.437964 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-config-data" (OuterVolumeSpecName: "config-data") pod "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" (UID: "bcc8fd49-b2be-4e55-9e01-0ef1949e5c66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.514421 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.514458 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.957853 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bcc8fd49-b2be-4e55-9e01-0ef1949e5c66","Type":"ContainerDied","Data":"2459720c09485bed9a65f48d51cf95c313576020a548c7daa9ccba7236f939ff"} Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.957938 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.958125 4585 scope.go:117] "RemoveContainer" containerID="1cdb88e45f977f5af4235e3d2f1482a82f1790c754743c80f0293f9eb4e4f95f" Feb 15 17:25:11 crc kubenswrapper[4585]: I0215 17:25:11.992344 4585 scope.go:117] "RemoveContainer" containerID="5bb9619bd93dbbdf1cd8c46a3bf6ad909a7d218dafc12b447182c1e4103cf556" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.014544 4585 scope.go:117] "RemoveContainer" containerID="2c159d4e934b99bb7cc26fd49aa67674f8490744043a941f631410a12f8581b9" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.044401 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.061935 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.064961 4585 scope.go:117] "RemoveContainer" containerID="2ffd749467ec493fbe19ebd3cfde8329fc2182bf003da1bdad9d6bc7de5f2ae1" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.072431 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:12 crc kubenswrapper[4585]: E0215 17:25:12.072892 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-notification-agent" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.072911 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-notification-agent" Feb 15 17:25:12 crc kubenswrapper[4585]: E0215 17:25:12.072931 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-central-agent" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.072937 4585 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-central-agent" Feb 15 17:25:12 crc kubenswrapper[4585]: E0215 17:25:12.072953 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="proxy-httpd" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.072960 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="proxy-httpd" Feb 15 17:25:12 crc kubenswrapper[4585]: E0215 17:25:12.072972 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="sg-core" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.072978 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="sg-core" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.073287 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="proxy-httpd" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.073306 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="sg-core" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.073322 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-central-agent" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.073329 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" containerName="ceilometer-notification-agent" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.075182 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.077421 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.078989 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.079157 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.110899 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136424 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-log-httpd\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136489 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136556 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136613 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-config-data\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136633 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-run-httpd\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136656 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-scripts\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136678 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6rw\" (UniqueName: \"kubernetes.io/projected/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-kube-api-access-pr6rw\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.136713 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238032 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238109 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238176 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-config-data\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238243 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-run-httpd\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238265 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-scripts\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238291 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6rw\" 
(UniqueName: \"kubernetes.io/projected/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-kube-api-access-pr6rw\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238328 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.238584 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-log-httpd\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.239068 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-run-httpd\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.242497 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.243138 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 
17:25:12.243514 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.243744 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-scripts\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.245886 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-config-data\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.255297 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6rw\" (UniqueName: \"kubernetes.io/projected/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-kube-api-access-pr6rw\") pod \"ceilometer-0\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.396541 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.537382 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.860077 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc8fd49-b2be-4e55-9e01-0ef1949e5c66" path="/var/lib/kubelet/pods/bcc8fd49-b2be-4e55-9e01-0ef1949e5c66/volumes" Feb 15 17:25:12 crc kubenswrapper[4585]: I0215 17:25:12.957178 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:13 crc kubenswrapper[4585]: I0215 17:25:13.226684 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 15 17:25:13 crc kubenswrapper[4585]: I0215 17:25:13.227178 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 15 17:25:13 crc kubenswrapper[4585]: I0215 17:25:13.984325 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerStarted","Data":"beee8caa26dba6c211051f228f46657a5f28aad7495ca323cb4cfe2b12ae0c4e"} Feb 15 17:25:13 crc kubenswrapper[4585]: I0215 17:25:13.984559 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerStarted","Data":"5e172a78b9723e0b85b0c829f679cbd685f7e3236a09772171c4c8378657e775"} Feb 15 17:25:14 crc kubenswrapper[4585]: I0215 17:25:14.244811 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 15 17:25:14 crc kubenswrapper[4585]: I0215 
17:25:14.244830 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 15 17:25:15 crc kubenswrapper[4585]: I0215 17:25:15.011131 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerStarted","Data":"703e0d5f968411cbece03271dd8e3338fac53294eb8e88afc3f136c75752e30b"} Feb 15 17:25:15 crc kubenswrapper[4585]: I0215 17:25:15.534579 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 15 17:25:15 crc kubenswrapper[4585]: I0215 17:25:15.564865 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 15 17:25:16 crc kubenswrapper[4585]: I0215 17:25:16.022046 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerStarted","Data":"d08f4515a1aaef48c8ea0790614179feb21631473ac27e20fdb32cd58e18c162"} Feb 15 17:25:16 crc kubenswrapper[4585]: I0215 17:25:16.053330 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 15 17:25:18 crc kubenswrapper[4585]: I0215 17:25:18.040607 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerStarted","Data":"ae3a97f94ad84b3d1c0931457b054ac43ce70cce455aa37c5d3a5225330de15c"} Feb 15 17:25:18 crc kubenswrapper[4585]: I0215 17:25:18.041137 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:25:18 crc kubenswrapper[4585]: I0215 17:25:18.060560 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7517906490000001 podStartE2EDuration="6.060544239s" podCreationTimestamp="2026-02-15 17:25:12 +0000 UTC" firstStartedPulling="2026-02-15 17:25:12.975890328 +0000 UTC m=+1168.919298470" lastFinishedPulling="2026-02-15 17:25:17.284643908 +0000 UTC m=+1173.228052060" observedRunningTime="2026-02-15 17:25:18.057455346 +0000 UTC m=+1174.000863478" watchObservedRunningTime="2026-02-15 17:25:18.060544239 +0000 UTC m=+1174.003952371" Feb 15 17:25:18 crc kubenswrapper[4585]: I0215 17:25:18.285349 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 15 17:25:18 crc kubenswrapper[4585]: I0215 17:25:18.285402 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 15 17:25:19 crc kubenswrapper[4585]: I0215 17:25:19.368742 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.232:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:25:19 crc kubenswrapper[4585]: I0215 17:25:19.368784 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.232:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 15 17:25:19 crc kubenswrapper[4585]: I0215 17:25:19.934307 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection refused" Feb 
15 17:25:23 crc kubenswrapper[4585]: I0215 17:25:23.236877 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 15 17:25:23 crc kubenswrapper[4585]: I0215 17:25:23.237785 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 15 17:25:23 crc kubenswrapper[4585]: I0215 17:25:23.254311 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 15 17:25:23 crc kubenswrapper[4585]: I0215 17:25:23.256269 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 15 17:25:28 crc kubenswrapper[4585]: I0215 17:25:28.290547 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 15 17:25:28 crc kubenswrapper[4585]: I0215 17:25:28.291350 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 15 17:25:28 crc kubenswrapper[4585]: I0215 17:25:28.293388 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 15 17:25:28 crc kubenswrapper[4585]: I0215 17:25:28.302208 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.177824 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.189361 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.422479 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mbrdb"] Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.426497 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.473541 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mbrdb"] Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.538384 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.538695 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.538721 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9t7m\" (UniqueName: \"kubernetes.io/projected/8982df7b-970d-41de-8ee6-2a71c14facb9-kube-api-access-f9t7m\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.538751 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.538776 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-config\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.538896 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.640245 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.640294 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.641272 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.641291 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.641312 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.641309 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.641368 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9t7m\" (UniqueName: \"kubernetes.io/projected/8982df7b-970d-41de-8ee6-2a71c14facb9-kube-api-access-f9t7m\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.641448 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.641503 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-config\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.642081 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-config\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.642543 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8982df7b-970d-41de-8ee6-2a71c14facb9-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.656358 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9t7m\" (UniqueName: \"kubernetes.io/projected/8982df7b-970d-41de-8ee6-2a71c14facb9-kube-api-access-f9t7m\") pod \"dnsmasq-dns-cd5cbd7b9-mbrdb\" (UID: \"8982df7b-970d-41de-8ee6-2a71c14facb9\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.774346 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.931765 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b9f5444b-8n6qh" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.184:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.184:8443: connect: connection refused" Feb 15 17:25:29 crc kubenswrapper[4585]: I0215 17:25:29.932105 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:25:30 crc kubenswrapper[4585]: I0215 17:25:30.348027 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mbrdb"] Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.200015 4585 generic.go:334] "Generic (PLEG): container finished" podID="8982df7b-970d-41de-8ee6-2a71c14facb9" containerID="f28c4783cf4c30bdd4116a2f128e39d59aad00cc3016b714ed3873ac67b49ba9" exitCode=0 Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.200468 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" event={"ID":"8982df7b-970d-41de-8ee6-2a71c14facb9","Type":"ContainerDied","Data":"f28c4783cf4c30bdd4116a2f128e39d59aad00cc3016b714ed3873ac67b49ba9"} Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.200609 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" event={"ID":"8982df7b-970d-41de-8ee6-2a71c14facb9","Type":"ContainerStarted","Data":"48d76476c530283f90f3775628b68cea55c71f70f170229722fedeb62da6b76d"} Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.840540 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.858909 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:31 crc 
kubenswrapper[4585]: I0215 17:25:31.859233 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-central-agent" containerID="cri-o://beee8caa26dba6c211051f228f46657a5f28aad7495ca323cb4cfe2b12ae0c4e" gracePeriod=30 Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.860088 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="proxy-httpd" containerID="cri-o://ae3a97f94ad84b3d1c0931457b054ac43ce70cce455aa37c5d3a5225330de15c" gracePeriod=30 Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.860176 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="sg-core" containerID="cri-o://d08f4515a1aaef48c8ea0790614179feb21631473ac27e20fdb32cd58e18c162" gracePeriod=30 Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.860220 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-notification-agent" containerID="cri-o://703e0d5f968411cbece03271dd8e3338fac53294eb8e88afc3f136c75752e30b" gracePeriod=30 Feb 15 17:25:31 crc kubenswrapper[4585]: I0215 17:25:31.873095 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.233:3000/\": EOF" Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.214584 4585 generic.go:334] "Generic (PLEG): container finished" podID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerID="ae3a97f94ad84b3d1c0931457b054ac43ce70cce455aa37c5d3a5225330de15c" exitCode=0 Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.214890 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerDied","Data":"ae3a97f94ad84b3d1c0931457b054ac43ce70cce455aa37c5d3a5225330de15c"} Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.214954 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerDied","Data":"d08f4515a1aaef48c8ea0790614179feb21631473ac27e20fdb32cd58e18c162"} Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.214916 4585 generic.go:334] "Generic (PLEG): container finished" podID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerID="d08f4515a1aaef48c8ea0790614179feb21631473ac27e20fdb32cd58e18c162" exitCode=2 Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.216940 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-log" containerID="cri-o://3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0" gracePeriod=30 Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.217501 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" event={"ID":"8982df7b-970d-41de-8ee6-2a71c14facb9","Type":"ContainerStarted","Data":"b5d6a82c2c8b88775dcdb42e83588bcb0eb1697f08bc83c3250f43391c50fb5c"} Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.217536 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:32 crc kubenswrapper[4585]: I0215 17:25:32.217892 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-api" containerID="cri-o://5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48" gracePeriod=30 Feb 15 17:25:32 crc kubenswrapper[4585]: 
I0215 17:25:32.244977 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" podStartSLOduration=3.244960124 podStartE2EDuration="3.244960124s" podCreationTimestamp="2026-02-15 17:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:25:32.234967413 +0000 UTC m=+1188.178375545" watchObservedRunningTime="2026-02-15 17:25:32.244960124 +0000 UTC m=+1188.188368256" Feb 15 17:25:33 crc kubenswrapper[4585]: I0215 17:25:33.228746 4585 generic.go:334] "Generic (PLEG): container finished" podID="ef491757-9152-43ed-81fd-86baccc15079" containerID="3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0" exitCode=143 Feb 15 17:25:33 crc kubenswrapper[4585]: I0215 17:25:33.228831 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef491757-9152-43ed-81fd-86baccc15079","Type":"ContainerDied","Data":"3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0"} Feb 15 17:25:33 crc kubenswrapper[4585]: I0215 17:25:33.232472 4585 generic.go:334] "Generic (PLEG): container finished" podID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerID="beee8caa26dba6c211051f228f46657a5f28aad7495ca323cb4cfe2b12ae0c4e" exitCode=0 Feb 15 17:25:33 crc kubenswrapper[4585]: I0215 17:25:33.232645 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerDied","Data":"beee8caa26dba6c211051f228f46657a5f28aad7495ca323cb4cfe2b12ae0c4e"} Feb 15 17:25:35 crc kubenswrapper[4585]: I0215 17:25:35.898779 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:35 crc kubenswrapper[4585]: I0215 17:25:35.903777 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.019953 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-logs\") pod \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020011 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24r8k\" (UniqueName: \"kubernetes.io/projected/ef491757-9152-43ed-81fd-86baccc15079-kube-api-access-24r8k\") pod \"ef491757-9152-43ed-81fd-86baccc15079\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020082 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-combined-ca-bundle\") pod \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020110 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-config-data\") pod \"ef491757-9152-43ed-81fd-86baccc15079\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020587 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-logs" (OuterVolumeSpecName: "logs") pod "f443582a-cc67-48f1-a3e5-9ba6af0fbec5" (UID: "f443582a-cc67-48f1-a3e5-9ba6af0fbec5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020748 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfg8t\" (UniqueName: \"kubernetes.io/projected/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-kube-api-access-bfg8t\") pod \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020783 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-config-data\") pod \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020830 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-combined-ca-bundle\") pod \"ef491757-9152-43ed-81fd-86baccc15079\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020848 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-scripts\") pod \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020870 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-secret-key\") pod \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020937 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ef491757-9152-43ed-81fd-86baccc15079-logs\") pod \"ef491757-9152-43ed-81fd-86baccc15079\" (UID: \"ef491757-9152-43ed-81fd-86baccc15079\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.020964 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-tls-certs\") pod \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\" (UID: \"f443582a-cc67-48f1-a3e5-9ba6af0fbec5\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.021374 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.022157 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef491757-9152-43ed-81fd-86baccc15079-logs" (OuterVolumeSpecName: "logs") pod "ef491757-9152-43ed-81fd-86baccc15079" (UID: "ef491757-9152-43ed-81fd-86baccc15079"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.029520 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f443582a-cc67-48f1-a3e5-9ba6af0fbec5" (UID: "f443582a-cc67-48f1-a3e5-9ba6af0fbec5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.038876 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef491757-9152-43ed-81fd-86baccc15079-kube-api-access-24r8k" (OuterVolumeSpecName: "kube-api-access-24r8k") pod "ef491757-9152-43ed-81fd-86baccc15079" (UID: "ef491757-9152-43ed-81fd-86baccc15079"). 
InnerVolumeSpecName "kube-api-access-24r8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.041708 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-kube-api-access-bfg8t" (OuterVolumeSpecName: "kube-api-access-bfg8t") pod "f443582a-cc67-48f1-a3e5-9ba6af0fbec5" (UID: "f443582a-cc67-48f1-a3e5-9ba6af0fbec5"). InnerVolumeSpecName "kube-api-access-bfg8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.067830 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f443582a-cc67-48f1-a3e5-9ba6af0fbec5" (UID: "f443582a-cc67-48f1-a3e5-9ba6af0fbec5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.110576 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef491757-9152-43ed-81fd-86baccc15079" (UID: "ef491757-9152-43ed-81fd-86baccc15079"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.123409 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.123626 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfg8t\" (UniqueName: \"kubernetes.io/projected/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-kube-api-access-bfg8t\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.123638 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.123647 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.123654 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef491757-9152-43ed-81fd-86baccc15079-logs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.123664 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24r8k\" (UniqueName: \"kubernetes.io/projected/ef491757-9152-43ed-81fd-86baccc15079-kube-api-access-24r8k\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.151904 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-config-data" (OuterVolumeSpecName: "config-data") pod "f443582a-cc67-48f1-a3e5-9ba6af0fbec5" (UID: 
"f443582a-cc67-48f1-a3e5-9ba6af0fbec5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.152059 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-scripts" (OuterVolumeSpecName: "scripts") pod "f443582a-cc67-48f1-a3e5-9ba6af0fbec5" (UID: "f443582a-cc67-48f1-a3e5-9ba6af0fbec5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.162032 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-config-data" (OuterVolumeSpecName: "config-data") pod "ef491757-9152-43ed-81fd-86baccc15079" (UID: "ef491757-9152-43ed-81fd-86baccc15079"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.216165 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f443582a-cc67-48f1-a3e5-9ba6af0fbec5" (UID: "f443582a-cc67-48f1-a3e5-9ba6af0fbec5"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.225800 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.225832 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.225842 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f443582a-cc67-48f1-a3e5-9ba6af0fbec5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.225853 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef491757-9152-43ed-81fd-86baccc15079-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.277100 4585 generic.go:334] "Generic (PLEG): container finished" podID="ef491757-9152-43ed-81fd-86baccc15079" containerID="5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48" exitCode=0 Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.277161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef491757-9152-43ed-81fd-86baccc15079","Type":"ContainerDied","Data":"5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48"} Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.277186 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef491757-9152-43ed-81fd-86baccc15079","Type":"ContainerDied","Data":"1bcdd64095cccacf9a283650c899982878649e075299bf314c9f4e437142e728"} Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 
17:25:36.277202 4585 scope.go:117] "RemoveContainer" containerID="5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.277323 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.283729 4585 generic.go:334] "Generic (PLEG): container finished" podID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerID="67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3" exitCode=137 Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.283787 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerDied","Data":"67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3"} Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.283812 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b9f5444b-8n6qh" event={"ID":"f443582a-cc67-48f1-a3e5-9ba6af0fbec5","Type":"ContainerDied","Data":"1e231b6e9af7cbba5a9cd65de87dc8c740c6f93e341ba7c568cf49ae06364479"} Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.283986 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b9f5444b-8n6qh" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.296749 4585 generic.go:334] "Generic (PLEG): container finished" podID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerID="703e0d5f968411cbece03271dd8e3338fac53294eb8e88afc3f136c75752e30b" exitCode=0 Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.296784 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerDied","Data":"703e0d5f968411cbece03271dd8e3338fac53294eb8e88afc3f136c75752e30b"} Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.302550 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.318374 4585 scope.go:117] "RemoveContainer" containerID="3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.325563 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.371506 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.393331 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b9f5444b-8n6qh"] Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.419143 4585 scope.go:117] "RemoveContainer" containerID="5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.419242 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420111 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 
17:25:36.420134 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420172 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="proxy-httpd" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420178 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="proxy-httpd" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420188 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-central-agent" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420194 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-central-agent" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420204 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-log" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420210 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-log" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420223 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-notification-agent" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420229 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-notification-agent" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420237 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon-log" Feb 15 17:25:36 crc kubenswrapper[4585]: 
I0215 17:25:36.420243 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon-log" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420260 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-api" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420266 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-api" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420274 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="sg-core" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420280 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="sg-core" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420491 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="sg-core" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420504 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-api" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420515 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon-log" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420526 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef491757-9152-43ed-81fd-86baccc15079" containerName="nova-api-log" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420536 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-central-agent" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420545 4585 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420552 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420564 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="proxy-httpd" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420575 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" containerName="ceilometer-notification-agent" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.420780 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.420791 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" containerName="horizon" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.421842 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.423171 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48\": container with ID starting with 5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48 not found: ID does not exist" containerID="5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.423213 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48"} err="failed to get container status \"5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48\": rpc error: code = NotFound desc = could not find container \"5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48\": container with ID starting with 5923fe4ba438bac17c003c4ed2bfd53d518ed27291b3f0c66129b59379060d48 not found: ID does not exist" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.423236 4585 scope.go:117] "RemoveContainer" containerID="3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.425241 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.425566 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.425934 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.431485 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b9f5444b-8n6qh"] Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 
17:25:36.433268 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-combined-ca-bundle\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.433315 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-run-httpd\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.433378 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-log-httpd\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.433411 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-config-data\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.433440 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr6rw\" (UniqueName: \"kubernetes.io/projected/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-kube-api-access-pr6rw\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.434002 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-scripts\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" 
(UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.434146 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-ceilometer-tls-certs\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.434213 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-sg-core-conf-yaml\") pod \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\" (UID: \"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e\") " Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.441858 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.441863 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0\": container with ID starting with 3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0 not found: ID does not exist" containerID="3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.441934 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0"} err="failed to get container status \"3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0\": rpc error: code = NotFound desc = could not find container \"3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0\": container with ID starting with 3994ddc2374dfe452e2cedb9f88dc4e8c6c84dbdbb4a30cfa33c918c365d68e0 not found: ID does not exist" Feb 15 17:25:36 crc 
kubenswrapper[4585]: I0215 17:25:36.441958 4585 scope.go:117] "RemoveContainer" containerID="7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.442307 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.445925 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-scripts" (OuterVolumeSpecName: "scripts") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.450801 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-kube-api-access-pr6rw" (OuterVolumeSpecName: "kube-api-access-pr6rw") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "kube-api-access-pr6rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.458182 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.530720 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.532693 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.540723 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.540766 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-public-tls-certs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.540813 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-config-data\") pod \"nova-api-0\" (UID: 
\"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.540904 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zjln\" (UniqueName: \"kubernetes.io/projected/61a2428b-0f64-47db-b464-cc21bac70b83-kube-api-access-2zjln\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.540940 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61a2428b-0f64-47db-b464-cc21bac70b83-logs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.540992 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.541047 4585 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.541056 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.541065 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 
17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.541072 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.541080 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr6rw\" (UniqueName: \"kubernetes.io/projected/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-kube-api-access-pr6rw\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.541088 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-scripts\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.642456 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.642505 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-public-tls-certs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.642551 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-config-data\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.642638 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2zjln\" (UniqueName: \"kubernetes.io/projected/61a2428b-0f64-47db-b464-cc21bac70b83-kube-api-access-2zjln\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.642665 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61a2428b-0f64-47db-b464-cc21bac70b83-logs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.642718 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.643428 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61a2428b-0f64-47db-b464-cc21bac70b83-logs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.652700 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.652707 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-config-data\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.657254 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.660545 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a2428b-0f64-47db-b464-cc21bac70b83-public-tls-certs\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.662551 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.664101 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zjln\" (UniqueName: \"kubernetes.io/projected/61a2428b-0f64-47db-b464-cc21bac70b83-kube-api-access-2zjln\") pod \"nova-api-0\" (UID: \"61a2428b-0f64-47db-b464-cc21bac70b83\") " pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.670923 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-config-data" (OuterVolumeSpecName: "config-data") pod "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" (UID: "e4ccde36-8ed0-42b6-8e4c-150a913c8c2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.747226 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.747527 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.781098 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.854244 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef491757-9152-43ed-81fd-86baccc15079" path="/var/lib/kubelet/pods/ef491757-9152-43ed-81fd-86baccc15079/volumes" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.855086 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f443582a-cc67-48f1-a3e5-9ba6af0fbec5" path="/var/lib/kubelet/pods/f443582a-cc67-48f1-a3e5-9ba6af0fbec5/volumes" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.904495 4585 scope.go:117] "RemoveContainer" containerID="67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.932446 4585 scope.go:117] "RemoveContainer" containerID="7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.932860 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f\": container with ID starting with 7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f not found: ID does not exist" 
containerID="7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.932905 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f"} err="failed to get container status \"7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f\": rpc error: code = NotFound desc = could not find container \"7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f\": container with ID starting with 7eae7529745bf523db64a7777c59595d6570e9eb25b0c90ba47916da0e190d9f not found: ID does not exist" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.932930 4585 scope.go:117] "RemoveContainer" containerID="67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3" Feb 15 17:25:36 crc kubenswrapper[4585]: E0215 17:25:36.933245 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3\": container with ID starting with 67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3 not found: ID does not exist" containerID="67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3" Feb 15 17:25:36 crc kubenswrapper[4585]: I0215 17:25:36.933280 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3"} err="failed to get container status \"67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3\": rpc error: code = NotFound desc = could not find container \"67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3\": container with ID starting with 67c4bcce6de0b4afe004f38f6e541d9658ea1d100629492ae8cf46820563cfc3 not found: ID does not exist" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.312211 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4ccde36-8ed0-42b6-8e4c-150a913c8c2e","Type":"ContainerDied","Data":"5e172a78b9723e0b85b0c829f679cbd685f7e3236a09772171c4c8378657e775"} Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.312296 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.313133 4585 scope.go:117] "RemoveContainer" containerID="ae3a97f94ad84b3d1c0931457b054ac43ce70cce455aa37c5d3a5225330de15c" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.362254 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.372084 4585 scope.go:117] "RemoveContainer" containerID="d08f4515a1aaef48c8ea0790614179feb21631473ac27e20fdb32cd58e18c162" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.384527 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.391993 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.394735 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.397203 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.397345 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.397861 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.410214 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.429135 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.429485 4585 scope.go:117] "RemoveContainer" containerID="703e0d5f968411cbece03271dd8e3338fac53294eb8e88afc3f136c75752e30b" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.573416 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e37c5e7e-27a1-4ca2-a04d-588392a0f115-log-httpd\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.573581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.573742 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.573798 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e37c5e7e-27a1-4ca2-a04d-588392a0f115-run-httpd\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.573830 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-scripts\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.573896 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlxvv\" (UniqueName: \"kubernetes.io/projected/e37c5e7e-27a1-4ca2-a04d-588392a0f115-kube-api-access-hlxvv\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.573935 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.574054 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-config-data\") pod \"ceilometer-0\" (UID: 
\"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.579308 4585 scope.go:117] "RemoveContainer" containerID="beee8caa26dba6c211051f228f46657a5f28aad7495ca323cb4cfe2b12ae0c4e" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.675760 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.675819 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.675857 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e37c5e7e-27a1-4ca2-a04d-588392a0f115-run-httpd\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.675889 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-scripts\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.675910 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlxvv\" (UniqueName: \"kubernetes.io/projected/e37c5e7e-27a1-4ca2-a04d-588392a0f115-kube-api-access-hlxvv\") pod \"ceilometer-0\" (UID: 
\"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.675927 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.676012 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-config-data\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.676054 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e37c5e7e-27a1-4ca2-a04d-588392a0f115-log-httpd\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.676495 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e37c5e7e-27a1-4ca2-a04d-588392a0f115-log-httpd\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.678317 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e37c5e7e-27a1-4ca2-a04d-588392a0f115-run-httpd\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.683783 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-config-data\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.685895 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-scripts\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.686208 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.686336 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.703886 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlxvv\" (UniqueName: \"kubernetes.io/projected/e37c5e7e-27a1-4ca2-a04d-588392a0f115-kube-api-access-hlxvv\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.705696 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37c5e7e-27a1-4ca2-a04d-588392a0f115-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e37c5e7e-27a1-4ca2-a04d-588392a0f115\") " pod="openstack/ceilometer-0" Feb 15 17:25:37 crc kubenswrapper[4585]: I0215 17:25:37.720709 
4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 15 17:25:38 crc kubenswrapper[4585]: I0215 17:25:38.251469 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 15 17:25:38 crc kubenswrapper[4585]: W0215 17:25:38.251840 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode37c5e7e_27a1_4ca2_a04d_588392a0f115.slice/crio-d019580b2fdf03cbe1cc9d63ca82c35503192c18e9a1b8165b3ab0a874f11d53 WatchSource:0}: Error finding container d019580b2fdf03cbe1cc9d63ca82c35503192c18e9a1b8165b3ab0a874f11d53: Status 404 returned error can't find the container with id d019580b2fdf03cbe1cc9d63ca82c35503192c18e9a1b8165b3ab0a874f11d53 Feb 15 17:25:38 crc kubenswrapper[4585]: I0215 17:25:38.323413 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61a2428b-0f64-47db-b464-cc21bac70b83","Type":"ContainerStarted","Data":"c71e57904df5aa6ebee52a17d80afff6e2cfe8c6eb2ae34d6023f58852bcdf3b"} Feb 15 17:25:38 crc kubenswrapper[4585]: I0215 17:25:38.323455 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61a2428b-0f64-47db-b464-cc21bac70b83","Type":"ContainerStarted","Data":"6d24b22f30f73342c75bd4b29cd16331009bd0dd1b68226355ddd9ac536c5c3c"} Feb 15 17:25:38 crc kubenswrapper[4585]: I0215 17:25:38.323467 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"61a2428b-0f64-47db-b464-cc21bac70b83","Type":"ContainerStarted","Data":"d3f79f210d8ef9c3760a846a88f83567941f3e3de68457090bef51fd648e78a7"} Feb 15 17:25:38 crc kubenswrapper[4585]: I0215 17:25:38.325866 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e37c5e7e-27a1-4ca2-a04d-588392a0f115","Type":"ContainerStarted","Data":"d019580b2fdf03cbe1cc9d63ca82c35503192c18e9a1b8165b3ab0a874f11d53"} Feb 15 17:25:38 
crc kubenswrapper[4585]: I0215 17:25:38.341513 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.341491789 podStartE2EDuration="2.341491789s" podCreationTimestamp="2026-02-15 17:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:25:38.337152871 +0000 UTC m=+1194.280561013" watchObservedRunningTime="2026-02-15 17:25:38.341491789 +0000 UTC m=+1194.284899921" Feb 15 17:25:38 crc kubenswrapper[4585]: I0215 17:25:38.869821 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ccde36-8ed0-42b6-8e4c-150a913c8c2e" path="/var/lib/kubelet/pods/e4ccde36-8ed0-42b6-8e4c-150a913c8c2e/volumes" Feb 15 17:25:39 crc kubenswrapper[4585]: I0215 17:25:39.339424 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e37c5e7e-27a1-4ca2-a04d-588392a0f115","Type":"ContainerStarted","Data":"a4eacfc60dfab388a1f9ecb0b58f7411eac333728518e29bf7e8674675c7a7f5"} Feb 15 17:25:39 crc kubenswrapper[4585]: I0215 17:25:39.775809 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-mbrdb" Feb 15 17:25:39 crc kubenswrapper[4585]: I0215 17:25:39.824838 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-45s2p"] Feb 15 17:25:39 crc kubenswrapper[4585]: I0215 17:25:39.825055 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" podUID="02233ef6-8c79-4706-aaff-b246384695b8" containerName="dnsmasq-dns" containerID="cri-o://18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398" gracePeriod=10 Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.336542 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.367828 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e37c5e7e-27a1-4ca2-a04d-588392a0f115","Type":"ContainerStarted","Data":"742ecf39ec11b6c8dd81eff63fd50cc73c3574712a0755199107f5e0179a5d65"} Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.372586 4585 generic.go:334] "Generic (PLEG): container finished" podID="02233ef6-8c79-4706-aaff-b246384695b8" containerID="18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398" exitCode=0 Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.372627 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" event={"ID":"02233ef6-8c79-4706-aaff-b246384695b8","Type":"ContainerDied","Data":"18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398"} Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.372643 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" event={"ID":"02233ef6-8c79-4706-aaff-b246384695b8","Type":"ContainerDied","Data":"9e653db19f3790b2775ba5c5e60d9f15240279ad90129bca62a58bf03c99c0c6"} Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.372659 4585 scope.go:117] "RemoveContainer" containerID="18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.372784 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-45s2p" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.400035 4585 scope.go:117] "RemoveContainer" containerID="ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.427995 4585 scope.go:117] "RemoveContainer" containerID="18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398" Feb 15 17:25:40 crc kubenswrapper[4585]: E0215 17:25:40.430650 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398\": container with ID starting with 18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398 not found: ID does not exist" containerID="18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.430688 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398"} err="failed to get container status \"18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398\": rpc error: code = NotFound desc = could not find container \"18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398\": container with ID starting with 18a9a9d83829a6bc0e1da055cd05cbd10e1bc6f2f6320fefa27aa8291c16a398 not found: ID does not exist" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.430712 4585 scope.go:117] "RemoveContainer" containerID="ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615" Feb 15 17:25:40 crc kubenswrapper[4585]: E0215 17:25:40.435892 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615\": container with ID starting with 
ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615 not found: ID does not exist" containerID="ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.435948 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615"} err="failed to get container status \"ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615\": rpc error: code = NotFound desc = could not find container \"ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615\": container with ID starting with ccd52bff9aeeb46d96f2af69f20259ffa99ffa44087a3326eaacdc524a96f615 not found: ID does not exist" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.443481 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-config\") pod \"02233ef6-8c79-4706-aaff-b246384695b8\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.443522 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-sb\") pod \"02233ef6-8c79-4706-aaff-b246384695b8\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.443570 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-swift-storage-0\") pod \"02233ef6-8c79-4706-aaff-b246384695b8\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.443651 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-svc\") pod \"02233ef6-8c79-4706-aaff-b246384695b8\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.443678 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-nb\") pod \"02233ef6-8c79-4706-aaff-b246384695b8\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.443696 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqhsd\" (UniqueName: \"kubernetes.io/projected/02233ef6-8c79-4706-aaff-b246384695b8-kube-api-access-fqhsd\") pod \"02233ef6-8c79-4706-aaff-b246384695b8\" (UID: \"02233ef6-8c79-4706-aaff-b246384695b8\") " Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.452878 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02233ef6-8c79-4706-aaff-b246384695b8-kube-api-access-fqhsd" (OuterVolumeSpecName: "kube-api-access-fqhsd") pod "02233ef6-8c79-4706-aaff-b246384695b8" (UID: "02233ef6-8c79-4706-aaff-b246384695b8"). InnerVolumeSpecName "kube-api-access-fqhsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.519412 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-config" (OuterVolumeSpecName: "config") pod "02233ef6-8c79-4706-aaff-b246384695b8" (UID: "02233ef6-8c79-4706-aaff-b246384695b8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.545434 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-config\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.545460 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqhsd\" (UniqueName: \"kubernetes.io/projected/02233ef6-8c79-4706-aaff-b246384695b8-kube-api-access-fqhsd\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.582041 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02233ef6-8c79-4706-aaff-b246384695b8" (UID: "02233ef6-8c79-4706-aaff-b246384695b8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.607081 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "02233ef6-8c79-4706-aaff-b246384695b8" (UID: "02233ef6-8c79-4706-aaff-b246384695b8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.647948 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.647980 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.655225 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02233ef6-8c79-4706-aaff-b246384695b8" (UID: "02233ef6-8c79-4706-aaff-b246384695b8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.656181 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02233ef6-8c79-4706-aaff-b246384695b8" (UID: "02233ef6-8c79-4706-aaff-b246384695b8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.749378 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.749406 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02233ef6-8c79-4706-aaff-b246384695b8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.766362 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-45s2p"] Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.775505 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-45s2p"] Feb 15 17:25:40 crc kubenswrapper[4585]: I0215 17:25:40.852817 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02233ef6-8c79-4706-aaff-b246384695b8" path="/var/lib/kubelet/pods/02233ef6-8c79-4706-aaff-b246384695b8/volumes" Feb 15 17:25:41 crc kubenswrapper[4585]: I0215 17:25:41.384664 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e37c5e7e-27a1-4ca2-a04d-588392a0f115","Type":"ContainerStarted","Data":"56d4a2532829de476dfe03ddee45e2bd1cca2e24311d5b324894775727fed70a"} Feb 15 17:25:42 crc kubenswrapper[4585]: I0215 17:25:42.398948 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e37c5e7e-27a1-4ca2-a04d-588392a0f115","Type":"ContainerStarted","Data":"8f2ab1c0fde3398a7bfd24f450d2565247c2bdb0b8db9b004ef6e3b41d7ab619"} Feb 15 17:25:42 crc kubenswrapper[4585]: I0215 17:25:42.399555 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 15 17:25:42 crc kubenswrapper[4585]: I0215 17:25:42.436397 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.101203421 podStartE2EDuration="5.436379038s" podCreationTimestamp="2026-02-15 17:25:37 +0000 UTC" firstStartedPulling="2026-02-15 17:25:38.255343059 +0000 UTC m=+1194.198751191" lastFinishedPulling="2026-02-15 17:25:41.590518676 +0000 UTC m=+1197.533926808" observedRunningTime="2026-02-15 17:25:42.419468549 +0000 UTC m=+1198.362876671" watchObservedRunningTime="2026-02-15 17:25:42.436379038 +0000 UTC m=+1198.379787180" Feb 15 17:25:46 crc kubenswrapper[4585]: I0215 17:25:46.787009 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 15 17:25:46 crc kubenswrapper[4585]: I0215 17:25:46.787702 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 15 17:25:47 crc kubenswrapper[4585]: I0215 17:25:47.811106 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="61a2428b-0f64-47db-b464-cc21bac70b83" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 15 17:25:47 crc kubenswrapper[4585]: I0215 17:25:47.811312 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="61a2428b-0f64-47db-b464-cc21bac70b83" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.235:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 15 17:25:56 crc kubenswrapper[4585]: I0215 17:25:56.800355 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 15 17:25:56 crc kubenswrapper[4585]: I0215 17:25:56.801525 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 15 17:25:56 crc kubenswrapper[4585]: I0215 17:25:56.811533 4585 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 15 17:25:56 crc kubenswrapper[4585]: I0215 17:25:56.831405 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 15 17:25:57 crc kubenswrapper[4585]: I0215 17:25:57.620251 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 15 17:25:57 crc kubenswrapper[4585]: I0215 17:25:57.683958 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 15 17:26:07 crc kubenswrapper[4585]: I0215 17:26:07.742567 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 15 17:26:47 crc kubenswrapper[4585]: I0215 17:26:47.014446 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:26:47 crc kubenswrapper[4585]: I0215 17:26:47.015175 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:27:17 crc kubenswrapper[4585]: I0215 17:27:17.014275 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:27:17 crc kubenswrapper[4585]: I0215 17:27:17.014888 4585 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:27:47 crc kubenswrapper[4585]: I0215 17:27:47.014166 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:27:47 crc kubenswrapper[4585]: I0215 17:27:47.015010 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:27:47 crc kubenswrapper[4585]: I0215 17:27:47.015073 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:27:47 crc kubenswrapper[4585]: I0215 17:27:47.015762 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc021170b16c2c23261c62b89393adc6ccb7098259eaf76788eecda62fff1dea"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:27:47 crc kubenswrapper[4585]: I0215 17:27:47.015814 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" 
containerID="cri-o://cc021170b16c2c23261c62b89393adc6ccb7098259eaf76788eecda62fff1dea" gracePeriod=600 Feb 15 17:27:47 crc kubenswrapper[4585]: I0215 17:27:47.985196 4585 scope.go:117] "RemoveContainer" containerID="a6634c0ef1a1562948706cf432145b94cb95b18b28a2970776b677077b76b153" Feb 15 17:27:48 crc kubenswrapper[4585]: I0215 17:27:48.011472 4585 scope.go:117] "RemoveContainer" containerID="362dd094c9efa09227c6baee153a70b11e42b2e27e174b9f80c8edb9192ffcf7" Feb 15 17:27:48 crc kubenswrapper[4585]: I0215 17:27:48.080891 4585 scope.go:117] "RemoveContainer" containerID="445760de021578cfbcb54d581b7cb9c8735b50399d6b0c66343381a2dff920a7" Feb 15 17:27:48 crc kubenswrapper[4585]: I0215 17:27:48.128739 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="cc021170b16c2c23261c62b89393adc6ccb7098259eaf76788eecda62fff1dea" exitCode=0 Feb 15 17:27:48 crc kubenswrapper[4585]: I0215 17:27:48.128791 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"cc021170b16c2c23261c62b89393adc6ccb7098259eaf76788eecda62fff1dea"} Feb 15 17:27:48 crc kubenswrapper[4585]: I0215 17:27:48.128869 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d"} Feb 15 17:27:48 crc kubenswrapper[4585]: I0215 17:27:48.128937 4585 scope.go:117] "RemoveContainer" containerID="6f05a1d156be3c851f680780cf5a4d67dc38d38043f804ea0899d1efe3927d68" Feb 15 17:27:48 crc kubenswrapper[4585]: I0215 17:27:48.133673 4585 scope.go:117] "RemoveContainer" containerID="d8e5e4394e1c825f0599f2cf17af2ea8fe55d5ff071f368d7c7fcc9f120dd3e6" Feb 15 17:28:48 crc kubenswrapper[4585]: I0215 17:28:48.249568 4585 
scope.go:117] "RemoveContainer" containerID="516a444534b85d854510fea82808ee1e24f78fb82f4510d73ab0069c17f25241" Feb 15 17:29:47 crc kubenswrapper[4585]: I0215 17:29:47.014813 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:29:47 crc kubenswrapper[4585]: I0215 17:29:47.015460 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.182818 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv"] Feb 15 17:30:00 crc kubenswrapper[4585]: E0215 17:30:00.183937 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02233ef6-8c79-4706-aaff-b246384695b8" containerName="init" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.183953 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="02233ef6-8c79-4706-aaff-b246384695b8" containerName="init" Feb 15 17:30:00 crc kubenswrapper[4585]: E0215 17:30:00.183994 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02233ef6-8c79-4706-aaff-b246384695b8" containerName="dnsmasq-dns" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.184001 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="02233ef6-8c79-4706-aaff-b246384695b8" containerName="dnsmasq-dns" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.184219 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="02233ef6-8c79-4706-aaff-b246384695b8" 
containerName="dnsmasq-dns" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.184899 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.186616 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.188027 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.226036 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv"] Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.305887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9gvh\" (UniqueName: \"kubernetes.io/projected/e90b20c5-0f18-4778-beeb-b011c4655d17-kube-api-access-s9gvh\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.305961 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e90b20c5-0f18-4778-beeb-b011c4655d17-secret-volume\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.306140 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e90b20c5-0f18-4778-beeb-b011c4655d17-config-volume\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.408347 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9gvh\" (UniqueName: \"kubernetes.io/projected/e90b20c5-0f18-4778-beeb-b011c4655d17-kube-api-access-s9gvh\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.408690 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e90b20c5-0f18-4778-beeb-b011c4655d17-secret-volume\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.408930 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e90b20c5-0f18-4778-beeb-b011c4655d17-config-volume\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.409953 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e90b20c5-0f18-4778-beeb-b011c4655d17-config-volume\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.420258 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e90b20c5-0f18-4778-beeb-b011c4655d17-secret-volume\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.423507 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9gvh\" (UniqueName: \"kubernetes.io/projected/e90b20c5-0f18-4778-beeb-b011c4655d17-kube-api-access-s9gvh\") pod \"collect-profiles-29519610-9zgmv\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:00 crc kubenswrapper[4585]: I0215 17:30:00.510792 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:01 crc kubenswrapper[4585]: I0215 17:30:01.020429 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv"] Feb 15 17:30:01 crc kubenswrapper[4585]: I0215 17:30:01.764386 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" event={"ID":"e90b20c5-0f18-4778-beeb-b011c4655d17","Type":"ContainerStarted","Data":"387e32b877556e08c51e54584bed561ad6df76b2ccf869581cd6bc6ec75648fe"} Feb 15 17:30:01 crc kubenswrapper[4585]: I0215 17:30:01.766025 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" event={"ID":"e90b20c5-0f18-4778-beeb-b011c4655d17","Type":"ContainerStarted","Data":"34e1263884a928f7e9e66c0c4443cf4bd3a063bc7030a218a934da5b98d9944d"} Feb 15 17:30:01 crc kubenswrapper[4585]: I0215 17:30:01.788437 4585 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" podStartSLOduration=1.7884183139999998 podStartE2EDuration="1.788418314s" podCreationTimestamp="2026-02-15 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 17:30:01.783778438 +0000 UTC m=+1457.727186580" watchObservedRunningTime="2026-02-15 17:30:01.788418314 +0000 UTC m=+1457.731826446" Feb 15 17:30:02 crc kubenswrapper[4585]: I0215 17:30:02.781945 4585 generic.go:334] "Generic (PLEG): container finished" podID="e90b20c5-0f18-4778-beeb-b011c4655d17" containerID="387e32b877556e08c51e54584bed561ad6df76b2ccf869581cd6bc6ec75648fe" exitCode=0 Feb 15 17:30:02 crc kubenswrapper[4585]: I0215 17:30:02.781984 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" event={"ID":"e90b20c5-0f18-4778-beeb-b011c4655d17","Type":"ContainerDied","Data":"387e32b877556e08c51e54584bed561ad6df76b2ccf869581cd6bc6ec75648fe"} Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.243183 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.400469 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e90b20c5-0f18-4778-beeb-b011c4655d17-config-volume\") pod \"e90b20c5-0f18-4778-beeb-b011c4655d17\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.400673 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9gvh\" (UniqueName: \"kubernetes.io/projected/e90b20c5-0f18-4778-beeb-b011c4655d17-kube-api-access-s9gvh\") pod \"e90b20c5-0f18-4778-beeb-b011c4655d17\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.400738 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e90b20c5-0f18-4778-beeb-b011c4655d17-secret-volume\") pod \"e90b20c5-0f18-4778-beeb-b011c4655d17\" (UID: \"e90b20c5-0f18-4778-beeb-b011c4655d17\") " Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.401279 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90b20c5-0f18-4778-beeb-b011c4655d17-config-volume" (OuterVolumeSpecName: "config-volume") pod "e90b20c5-0f18-4778-beeb-b011c4655d17" (UID: "e90b20c5-0f18-4778-beeb-b011c4655d17"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.406690 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90b20c5-0f18-4778-beeb-b011c4655d17-kube-api-access-s9gvh" (OuterVolumeSpecName: "kube-api-access-s9gvh") pod "e90b20c5-0f18-4778-beeb-b011c4655d17" (UID: "e90b20c5-0f18-4778-beeb-b011c4655d17"). 
InnerVolumeSpecName "kube-api-access-s9gvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.406874 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e90b20c5-0f18-4778-beeb-b011c4655d17-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e90b20c5-0f18-4778-beeb-b011c4655d17" (UID: "e90b20c5-0f18-4778-beeb-b011c4655d17"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.503298 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e90b20c5-0f18-4778-beeb-b011c4655d17-config-volume\") on node \"crc\" DevicePath \"\"" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.503485 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9gvh\" (UniqueName: \"kubernetes.io/projected/e90b20c5-0f18-4778-beeb-b011c4655d17-kube-api-access-s9gvh\") on node \"crc\" DevicePath \"\"" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.503543 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e90b20c5-0f18-4778-beeb-b011c4655d17-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.835964 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" event={"ID":"e90b20c5-0f18-4778-beeb-b011c4655d17","Type":"ContainerDied","Data":"34e1263884a928f7e9e66c0c4443cf4bd3a063bc7030a218a934da5b98d9944d"} Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.836000 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e1263884a928f7e9e66c0c4443cf4bd3a063bc7030a218a934da5b98d9944d" Feb 15 17:30:04 crc kubenswrapper[4585]: I0215 17:30:04.836051 4585 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519610-9zgmv" Feb 15 17:30:17 crc kubenswrapper[4585]: I0215 17:30:17.014134 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:30:17 crc kubenswrapper[4585]: I0215 17:30:17.014561 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.014344 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.014804 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.014851 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.015579 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.015636 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" gracePeriod=600 Feb 15 17:30:47 crc kubenswrapper[4585]: E0215 17:30:47.141840 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.243928 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" exitCode=0 Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.243976 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d"} Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.244013 4585 scope.go:117] "RemoveContainer" containerID="cc021170b16c2c23261c62b89393adc6ccb7098259eaf76788eecda62fff1dea" Feb 15 17:30:47 crc kubenswrapper[4585]: I0215 17:30:47.244877 4585 
scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:30:47 crc kubenswrapper[4585]: E0215 17:30:47.245343 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:30:52 crc kubenswrapper[4585]: I0215 17:30:52.057001 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-w957z"] Feb 15 17:30:52 crc kubenswrapper[4585]: I0215 17:30:52.066748 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-w957z"] Feb 15 17:30:52 crc kubenswrapper[4585]: I0215 17:30:52.855994 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab2a2c2-523c-43c3-9776-902776a34ad4" path="/var/lib/kubelet/pods/7ab2a2c2-523c-43c3-9776-902776a34ad4/volumes" Feb 15 17:30:53 crc kubenswrapper[4585]: I0215 17:30:53.040513 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b1da-account-create-update-nqw28"] Feb 15 17:30:53 crc kubenswrapper[4585]: I0215 17:30:53.057317 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rtgm2"] Feb 15 17:30:53 crc kubenswrapper[4585]: I0215 17:30:53.069029 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b1da-account-create-update-nqw28"] Feb 15 17:30:53 crc kubenswrapper[4585]: I0215 17:30:53.080321 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rtgm2"] Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.040362 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-create-2vh2n"] Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.056692 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fbb4-account-create-update-pc58j"] Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.067752 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1491-account-create-update-f9fqz"] Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.078042 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2vh2n"] Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.086498 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1491-account-create-update-f9fqz"] Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.095443 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fbb4-account-create-update-pc58j"] Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.863913 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="224730b2-4e83-42a8-b057-11c1ae0fd14f" path="/var/lib/kubelet/pods/224730b2-4e83-42a8-b057-11c1ae0fd14f/volumes" Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.865482 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8916579d-e13d-4087-8c29-41a8ec210f9a" path="/var/lib/kubelet/pods/8916579d-e13d-4087-8c29-41a8ec210f9a/volumes" Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.869321 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94c30da-2a32-4256-a0a5-13c6c7a54725" path="/var/lib/kubelet/pods/c94c30da-2a32-4256-a0a5-13c6c7a54725/volumes" Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.871799 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26f1cb2-7405-4d6e-bbde-c172c4bc32b2" path="/var/lib/kubelet/pods/e26f1cb2-7405-4d6e-bbde-c172c4bc32b2/volumes" Feb 15 17:30:54 crc kubenswrapper[4585]: I0215 17:30:54.875902 4585 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece8b5dd-b54b-400c-a62a-4995d15e8763" path="/var/lib/kubelet/pods/ece8b5dd-b54b-400c-a62a-4995d15e8763/volumes" Feb 15 17:31:02 crc kubenswrapper[4585]: I0215 17:31:02.842210 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:31:02 crc kubenswrapper[4585]: E0215 17:31:02.842909 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:31:17 crc kubenswrapper[4585]: I0215 17:31:17.842871 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:31:17 crc kubenswrapper[4585]: E0215 17:31:17.843427 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:31:18 crc kubenswrapper[4585]: I0215 17:31:18.047050 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-f248v"] Feb 15 17:31:18 crc kubenswrapper[4585]: I0215 17:31:18.056168 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-f248v"] Feb 15 17:31:18 crc kubenswrapper[4585]: I0215 17:31:18.859419 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b1e0d19-52a5-4959-9e9f-74094993a95c" path="/var/lib/kubelet/pods/0b1e0d19-52a5-4959-9e9f-74094993a95c/volumes" Feb 15 17:31:32 crc kubenswrapper[4585]: I0215 17:31:32.038618 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7nlr5"] Feb 15 17:31:32 crc kubenswrapper[4585]: I0215 17:31:32.052325 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7nlr5"] Feb 15 17:31:32 crc kubenswrapper[4585]: I0215 17:31:32.842364 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:31:32 crc kubenswrapper[4585]: E0215 17:31:32.842921 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:31:32 crc kubenswrapper[4585]: I0215 17:31:32.859357 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d959ed97-3c6f-4503-864e-57104658b927" path="/var/lib/kubelet/pods/d959ed97-3c6f-4503-864e-57104658b927/volumes" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.769303 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lvbkp"] Feb 15 17:31:40 crc kubenswrapper[4585]: E0215 17:31:40.770667 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90b20c5-0f18-4778-beeb-b011c4655d17" containerName="collect-profiles" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.770689 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90b20c5-0f18-4778-beeb-b011c4655d17" containerName="collect-profiles" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.771168 4585 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e90b20c5-0f18-4778-beeb-b011c4655d17" containerName="collect-profiles" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.774042 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.778848 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvbkp"] Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.846162 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm49x\" (UniqueName: \"kubernetes.io/projected/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-kube-api-access-dm49x\") pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.846562 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-utilities\") pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.846706 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-catalog-content\") pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.948453 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-utilities\") 
pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.948905 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-utilities\") pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.948985 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-catalog-content\") pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.949257 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-catalog-content\") pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.949607 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm49x\" (UniqueName: \"kubernetes.io/projected/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-kube-api-access-dm49x\") pod \"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:40 crc kubenswrapper[4585]: I0215 17:31:40.970066 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm49x\" (UniqueName: \"kubernetes.io/projected/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-kube-api-access-dm49x\") pod 
\"community-operators-lvbkp\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") " pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.036870 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e401-account-create-update-4qk4n"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.046857 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xdr6c"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.057831 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cg64k"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.066685 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-da70-account-create-update-gvms7"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.075661 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1cf8-account-create-update-d6lrz"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.084916 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e401-account-create-update-4qk4n"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.092693 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1cf8-account-create-update-d6lrz"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.100988 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xdr6c"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.110415 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cg64k"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.119003 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-da70-account-create-update-gvms7"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.127730 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-nkbd8"] Feb 15 17:31:41 
crc kubenswrapper[4585]: I0215 17:31:41.132945 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvbkp" Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.141346 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-nkbd8"] Feb 15 17:31:41 crc kubenswrapper[4585]: I0215 17:31:41.606335 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvbkp"] Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.005760 4585 generic.go:334] "Generic (PLEG): container finished" podID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerID="98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142" exitCode=0 Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.005846 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvbkp" event={"ID":"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6","Type":"ContainerDied","Data":"98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142"} Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.007360 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvbkp" event={"ID":"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6","Type":"ContainerStarted","Data":"f07266ee90f1c5f663e6c3b6403d042279861dabf5923982c8acc5ce1fa056ed"} Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.009332 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.853283 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2724d82d-6eac-48a5-8ea8-4008702ab558" path="/var/lib/kubelet/pods/2724d82d-6eac-48a5-8ea8-4008702ab558/volumes" Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.854361 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="38b7dbb3-1a0e-47a6-b248-039a22229706" path="/var/lib/kubelet/pods/38b7dbb3-1a0e-47a6-b248-039a22229706/volumes"
Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.855292 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f8a86e-f29d-4ab2-ae55-762f90fdd6ad" path="/var/lib/kubelet/pods/47f8a86e-f29d-4ab2-ae55-762f90fdd6ad/volumes"
Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.856382 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a35fd8-ae4c-4c30-a41b-0b1413b82ef4" path="/var/lib/kubelet/pods/49a35fd8-ae4c-4c30-a41b-0b1413b82ef4/volumes"
Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.857021 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792430d2-7a82-48c8-b091-fa2875ded5e8" path="/var/lib/kubelet/pods/792430d2-7a82-48c8-b091-fa2875ded5e8/volumes"
Feb 15 17:31:42 crc kubenswrapper[4585]: I0215 17:31:42.857584 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3f97f63-b51e-4adf-8ec3-1d2aefc4c476" path="/var/lib/kubelet/pods/f3f97f63-b51e-4adf-8ec3-1d2aefc4c476/volumes"
Feb 15 17:31:43 crc kubenswrapper[4585]: I0215 17:31:43.025780 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvbkp" event={"ID":"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6","Type":"ContainerStarted","Data":"00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719"}
Feb 15 17:31:45 crc kubenswrapper[4585]: I0215 17:31:45.047354 4585 generic.go:334] "Generic (PLEG): container finished" podID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerID="00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719" exitCode=0
Feb 15 17:31:45 crc kubenswrapper[4585]: I0215 17:31:45.047438 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvbkp" event={"ID":"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6","Type":"ContainerDied","Data":"00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719"}
Feb 15 17:31:46 crc kubenswrapper[4585]: I0215 17:31:46.068135 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvbkp" event={"ID":"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6","Type":"ContainerStarted","Data":"5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704"}
Feb 15 17:31:47 crc kubenswrapper[4585]: I0215 17:31:47.842318 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d"
Feb 15 17:31:47 crc kubenswrapper[4585]: E0215 17:31:47.842950 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.431042 4585 scope.go:117] "RemoveContainer" containerID="5868f8510d0aae4b161978a06a88f463f53b293dc09a0fd1522fa53227a705d8"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.467328 4585 scope.go:117] "RemoveContainer" containerID="61818f492a085cfc2f7bae53c0cae6feee90e669bbabf656eef07532e41f8901"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.517404 4585 scope.go:117] "RemoveContainer" containerID="04d960b801b51e4c7dcf70018d28226345910bcd3c4ca3d02660bd3e81871f31"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.586515 4585 scope.go:117] "RemoveContainer" containerID="3c88581b6d975449e8fd5787046dbf15c976a17261ef585a85565ea999732c8a"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.606358 4585 scope.go:117] "RemoveContainer" containerID="3d38d5d04daecc02733746c64928e47c31b78bf58d8e657112b54ac302d83cd1"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.658786 4585 scope.go:117] "RemoveContainer" containerID="af9c794c4169f98d81ca0d6f6d4ff4e1c56217fa801b953e97b3e4902c982cef"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.713938 4585 scope.go:117] "RemoveContainer" containerID="fd5dcb6d224d611aa5492d2aa912076a195cb4261bcae88d4d6cf47ffad335ac"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.750888 4585 scope.go:117] "RemoveContainer" containerID="a113f417811566209ff92b27bc2faec207eb4ff9fabbbfb28ee4d56728279113"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.780494 4585 scope.go:117] "RemoveContainer" containerID="4750d1ecd741fbda12fb7fb1a39c13166055c77a7c78006eb78b482e6bd7d111"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.800810 4585 scope.go:117] "RemoveContainer" containerID="1133775a544ecd10893348bd244e644d535aafa7205b58a646fb768055cab0c3"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.824707 4585 scope.go:117] "RemoveContainer" containerID="506451b50970b852b7fc0b12b953d97fe283737f32279dbe24de47e4899e1b1e"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.845837 4585 scope.go:117] "RemoveContainer" containerID="6ed929907234f3bf0f8d200fc28e94a9714b1d0fd2a54847a6fb170ec14504b7"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.875039 4585 scope.go:117] "RemoveContainer" containerID="8e4844b141caedf8bcbbb89a0019b89351c81bc8f01dc6a6c17894a433d2fc98"
Feb 15 17:31:48 crc kubenswrapper[4585]: I0215 17:31:48.901193 4585 scope.go:117] "RemoveContainer" containerID="c6e2eeb157b98e31a26fb4f504ef0d56b2f0e8954627bf92596f438ac46a0495"
Feb 15 17:31:49 crc kubenswrapper[4585]: I0215 17:31:49.030834 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lvbkp" podStartSLOduration=5.622024936 podStartE2EDuration="9.0308188s" podCreationTimestamp="2026-02-15 17:31:40 +0000 UTC" firstStartedPulling="2026-02-15 17:31:42.009144501 +0000 UTC m=+1557.952552633" lastFinishedPulling="2026-02-15 17:31:45.417938365 +0000 UTC m=+1561.361346497" observedRunningTime="2026-02-15 17:31:46.101066096 +0000 UTC m=+1562.044474268" watchObservedRunningTime="2026-02-15 17:31:49.0308188 +0000 UTC m=+1564.974226932"
Feb 15 17:31:49 crc kubenswrapper[4585]: I0215 17:31:49.035926 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cg995"]
Feb 15 17:31:49 crc kubenswrapper[4585]: I0215 17:31:49.048932 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cg995"]
Feb 15 17:31:50 crc kubenswrapper[4585]: I0215 17:31:50.862276 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f71e9d8-c516-41bf-89de-ddd7d51519f6" path="/var/lib/kubelet/pods/5f71e9d8-c516-41bf-89de-ddd7d51519f6/volumes"
Feb 15 17:31:51 crc kubenswrapper[4585]: I0215 17:31:51.134172 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lvbkp"
Feb 15 17:31:51 crc kubenswrapper[4585]: I0215 17:31:51.134219 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lvbkp"
Feb 15 17:31:51 crc kubenswrapper[4585]: I0215 17:31:51.196197 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lvbkp"
Feb 15 17:31:52 crc kubenswrapper[4585]: I0215 17:31:52.232579 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lvbkp"
Feb 15 17:31:52 crc kubenswrapper[4585]: I0215 17:31:52.311506 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvbkp"]
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.178788 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lvbkp" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="registry-server" containerID="cri-o://5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704" gracePeriod=2
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.741888 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvbkp"
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.897864 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm49x\" (UniqueName: \"kubernetes.io/projected/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-kube-api-access-dm49x\") pod \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") "
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.897983 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-catalog-content\") pod \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") "
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.898180 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-utilities\") pod \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\" (UID: \"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6\") "
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.899740 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-utilities" (OuterVolumeSpecName: "utilities") pod "fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" (UID: "fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.904388 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-kube-api-access-dm49x" (OuterVolumeSpecName: "kube-api-access-dm49x") pod "fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" (UID: "fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6"). InnerVolumeSpecName "kube-api-access-dm49x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:31:54 crc kubenswrapper[4585]: I0215 17:31:54.958001 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" (UID: "fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.000765 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm49x\" (UniqueName: \"kubernetes.io/projected/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-kube-api-access-dm49x\") on node \"crc\" DevicePath \"\""
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.001035 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.001045 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6-utilities\") on node \"crc\" DevicePath \"\""
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.192913 4585 generic.go:334] "Generic (PLEG): container finished" podID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerID="5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704" exitCode=0
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.192954 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvbkp" event={"ID":"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6","Type":"ContainerDied","Data":"5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704"}
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.192978 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvbkp" event={"ID":"fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6","Type":"ContainerDied","Data":"f07266ee90f1c5f663e6c3b6403d042279861dabf5923982c8acc5ce1fa056ed"}
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.192996 4585 scope.go:117] "RemoveContainer" containerID="5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.193115 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvbkp"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.236912 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvbkp"]
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.242773 4585 scope.go:117] "RemoveContainer" containerID="00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.250259 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lvbkp"]
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.278019 4585 scope.go:117] "RemoveContainer" containerID="98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.336106 4585 scope.go:117] "RemoveContainer" containerID="5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704"
Feb 15 17:31:55 crc kubenswrapper[4585]: E0215 17:31:55.336629 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704\": container with ID starting with 5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704 not found: ID does not exist" containerID="5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.336659 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704"} err="failed to get container status \"5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704\": rpc error: code = NotFound desc = could not find container \"5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704\": container with ID starting with 5bb981e1a633df6a7bac27297e59a0f7aa068238d2c90cded68f4d13c0b21704 not found: ID does not exist"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.336679 4585 scope.go:117] "RemoveContainer" containerID="00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719"
Feb 15 17:31:55 crc kubenswrapper[4585]: E0215 17:31:55.337457 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719\": container with ID starting with 00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719 not found: ID does not exist" containerID="00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.337507 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719"} err="failed to get container status \"00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719\": rpc error: code = NotFound desc = could not find container \"00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719\": container with ID starting with 00e93d6c4fbb5771613ade632d77183a38f25813a959aba320fb0b06c8aac719 not found: ID does not exist"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.337539 4585 scope.go:117] "RemoveContainer" containerID="98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142"
Feb 15 17:31:55 crc kubenswrapper[4585]: E0215 17:31:55.338104 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142\": container with ID starting with 98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142 not found: ID does not exist" containerID="98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142"
Feb 15 17:31:55 crc kubenswrapper[4585]: I0215 17:31:55.338132 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142"} err="failed to get container status \"98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142\": rpc error: code = NotFound desc = could not find container \"98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142\": container with ID starting with 98b954fe9d45afb6c8b3748d83c65680999fce497e1808742cf3984fc019a142 not found: ID does not exist"
Feb 15 17:31:56 crc kubenswrapper[4585]: I0215 17:31:56.868076 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" path="/var/lib/kubelet/pods/fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6/volumes"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.220078 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-598xh"]
Feb 15 17:31:58 crc kubenswrapper[4585]: E0215 17:31:58.220761 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="registry-server"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.220777 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="registry-server"
Feb 15 17:31:58 crc kubenswrapper[4585]: E0215 17:31:58.220840 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="extract-content"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.220849 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="extract-content"
Feb 15 17:31:58 crc kubenswrapper[4585]: E0215 17:31:58.220871 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="extract-utilities"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.220880 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="extract-utilities"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.221210 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7d87aa-0a2e-4bd7-8b2a-5ca470947db6" containerName="registry-server"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.223294 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.237502 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-598xh"]
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.383094 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-utilities\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.383186 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdsmv\" (UniqueName: \"kubernetes.io/projected/047b2db9-40e0-48d7-9525-88aef9590848-kube-api-access-sdsmv\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.383562 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-catalog-content\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.485604 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-catalog-content\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.485695 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-utilities\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.485731 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdsmv\" (UniqueName: \"kubernetes.io/projected/047b2db9-40e0-48d7-9525-88aef9590848-kube-api-access-sdsmv\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.486369 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-catalog-content\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.489993 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-utilities\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.524393 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdsmv\" (UniqueName: \"kubernetes.io/projected/047b2db9-40e0-48d7-9525-88aef9590848-kube-api-access-sdsmv\") pod \"redhat-operators-598xh\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:58 crc kubenswrapper[4585]: I0215 17:31:58.597168 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:31:59 crc kubenswrapper[4585]: I0215 17:31:59.125163 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-598xh"]
Feb 15 17:31:59 crc kubenswrapper[4585]: I0215 17:31:59.255534 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-598xh" event={"ID":"047b2db9-40e0-48d7-9525-88aef9590848","Type":"ContainerStarted","Data":"4ab54b2d12894acf1f8b9ad03bcc6d6b98f2b34e797cb9d95b58a53520c044a6"}
Feb 15 17:31:59 crc kubenswrapper[4585]: I0215 17:31:59.842059 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d"
Feb 15 17:31:59 crc kubenswrapper[4585]: E0215 17:31:59.842524 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:32:00 crc kubenswrapper[4585]: I0215 17:32:00.268548 4585 generic.go:334] "Generic (PLEG): container finished" podID="047b2db9-40e0-48d7-9525-88aef9590848" containerID="7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736" exitCode=0
Feb 15 17:32:00 crc kubenswrapper[4585]: I0215 17:32:00.268620 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-598xh" event={"ID":"047b2db9-40e0-48d7-9525-88aef9590848","Type":"ContainerDied","Data":"7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736"}
Feb 15 17:32:01 crc kubenswrapper[4585]: I0215 17:32:01.288251 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-598xh" event={"ID":"047b2db9-40e0-48d7-9525-88aef9590848","Type":"ContainerStarted","Data":"6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27"}
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.614984 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wbpm8"]
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.618311 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.627239 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbpm8"]
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.733293 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgr5v\" (UniqueName: \"kubernetes.io/projected/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-kube-api-access-cgr5v\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.733401 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-catalog-content\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.733451 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-utilities\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.835730 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgr5v\" (UniqueName: \"kubernetes.io/projected/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-kube-api-access-cgr5v\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.835851 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-catalog-content\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.835897 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-utilities\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.836386 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-catalog-content\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.836465 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-utilities\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.859680 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgr5v\" (UniqueName: \"kubernetes.io/projected/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-kube-api-access-cgr5v\") pod \"redhat-marketplace-wbpm8\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") " pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:05 crc kubenswrapper[4585]: I0215 17:32:05.938262 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:06 crc kubenswrapper[4585]: I0215 17:32:06.356757 4585 generic.go:334] "Generic (PLEG): container finished" podID="047b2db9-40e0-48d7-9525-88aef9590848" containerID="6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27" exitCode=0
Feb 15 17:32:06 crc kubenswrapper[4585]: I0215 17:32:06.356838 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-598xh" event={"ID":"047b2db9-40e0-48d7-9525-88aef9590848","Type":"ContainerDied","Data":"6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27"}
Feb 15 17:32:06 crc kubenswrapper[4585]: I0215 17:32:06.490748 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbpm8"]
Feb 15 17:32:07 crc kubenswrapper[4585]: I0215 17:32:07.368537 4585 generic.go:334] "Generic (PLEG): container finished" podID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerID="096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4" exitCode=0
Feb 15 17:32:07 crc kubenswrapper[4585]: I0215 17:32:07.368902 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbpm8" event={"ID":"54cbb2ea-4b23-4d14-a1b3-9272c7da49be","Type":"ContainerDied","Data":"096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4"}
Feb 15 17:32:07 crc kubenswrapper[4585]: I0215 17:32:07.368932 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbpm8" event={"ID":"54cbb2ea-4b23-4d14-a1b3-9272c7da49be","Type":"ContainerStarted","Data":"a20c6638dcb69e5ee419242513203255f61f730a56265161738d641320ea543a"}
Feb 15 17:32:07 crc kubenswrapper[4585]: I0215 17:32:07.384377 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-598xh" event={"ID":"047b2db9-40e0-48d7-9525-88aef9590848","Type":"ContainerStarted","Data":"d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661"}
Feb 15 17:32:07 crc kubenswrapper[4585]: I0215 17:32:07.416622 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-598xh" podStartSLOduration=2.689324124 podStartE2EDuration="9.416551237s" podCreationTimestamp="2026-02-15 17:31:58 +0000 UTC" firstStartedPulling="2026-02-15 17:32:00.270869979 +0000 UTC m=+1576.214278121" lastFinishedPulling="2026-02-15 17:32:06.998097102 +0000 UTC m=+1582.941505234" observedRunningTime="2026-02-15 17:32:07.405269962 +0000 UTC m=+1583.348678094" watchObservedRunningTime="2026-02-15 17:32:07.416551237 +0000 UTC m=+1583.359959379"
Feb 15 17:32:08 crc kubenswrapper[4585]: I0215 17:32:08.395071 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbpm8" event={"ID":"54cbb2ea-4b23-4d14-a1b3-9272c7da49be","Type":"ContainerStarted","Data":"3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf"}
Feb 15 17:32:08 crc kubenswrapper[4585]: I0215 17:32:08.597280 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:32:08 crc kubenswrapper[4585]: I0215 17:32:08.597331 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-598xh"
Feb 15 17:32:09 crc kubenswrapper[4585]: I0215 17:32:09.416986 4585 generic.go:334] "Generic (PLEG): container finished" podID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerID="3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf" exitCode=0
Feb 15 17:32:09 crc kubenswrapper[4585]: I0215 17:32:09.417168 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbpm8" event={"ID":"54cbb2ea-4b23-4d14-a1b3-9272c7da49be","Type":"ContainerDied","Data":"3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf"}
Feb 15 17:32:09 crc kubenswrapper[4585]: I0215 17:32:09.640666 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-598xh" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="registry-server" probeResult="failure" output=<
Feb 15 17:32:09 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s
Feb 15 17:32:09 crc kubenswrapper[4585]: >
Feb 15 17:32:10 crc kubenswrapper[4585]: I0215 17:32:10.432078 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbpm8" event={"ID":"54cbb2ea-4b23-4d14-a1b3-9272c7da49be","Type":"ContainerStarted","Data":"87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845"}
Feb 15 17:32:10 crc kubenswrapper[4585]: I0215 17:32:10.466759 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wbpm8" podStartSLOduration=3.02573035 podStartE2EDuration="5.466739033s" podCreationTimestamp="2026-02-15 17:32:05 +0000 UTC" firstStartedPulling="2026-02-15 17:32:07.372532998 +0000 UTC m=+1583.315941130" lastFinishedPulling="2026-02-15 17:32:09.813541681 +0000 UTC m=+1585.756949813" observedRunningTime="2026-02-15 17:32:10.461047919 +0000 UTC m=+1586.404456051" watchObservedRunningTime="2026-02-15 17:32:10.466739033 +0000 UTC m=+1586.410147165"
Feb 15 17:32:14 crc kubenswrapper[4585]: I0215 17:32:14.857460 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d"
Feb 15 17:32:14 crc kubenswrapper[4585]: E0215 17:32:14.858179 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:32:15 crc kubenswrapper[4585]: I0215 17:32:15.939735 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:15 crc kubenswrapper[4585]: I0215 17:32:15.939812 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:15 crc kubenswrapper[4585]: I0215 17:32:15.996978 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:16 crc kubenswrapper[4585]: I0215 17:32:16.546949 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:16 crc kubenswrapper[4585]: I0215 17:32:16.611861 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbpm8"]
Feb 15 17:32:18 crc kubenswrapper[4585]: I0215 17:32:18.539121 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wbpm8" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="registry-server" containerID="cri-o://87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845" gracePeriod=2
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.015012 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbpm8"
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.152000 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-catalog-content\") pod \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") "
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.152541 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-utilities\") pod \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") "
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.152656 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgr5v\" (UniqueName: \"kubernetes.io/projected/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-kube-api-access-cgr5v\") pod \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\" (UID: \"54cbb2ea-4b23-4d14-a1b3-9272c7da49be\") "
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.156273 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-utilities" (OuterVolumeSpecName: "utilities") pod "54cbb2ea-4b23-4d14-a1b3-9272c7da49be" (UID: "54cbb2ea-4b23-4d14-a1b3-9272c7da49be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.159856 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-kube-api-access-cgr5v" (OuterVolumeSpecName: "kube-api-access-cgr5v") pod "54cbb2ea-4b23-4d14-a1b3-9272c7da49be" (UID: "54cbb2ea-4b23-4d14-a1b3-9272c7da49be"). InnerVolumeSpecName "kube-api-access-cgr5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.196898 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54cbb2ea-4b23-4d14-a1b3-9272c7da49be" (UID: "54cbb2ea-4b23-4d14-a1b3-9272c7da49be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.255537 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-utilities\") on node \"crc\" DevicePath \"\""
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.255568 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgr5v\" (UniqueName: \"kubernetes.io/projected/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-kube-api-access-cgr5v\") on node \"crc\" DevicePath \"\""
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.255583 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54cbb2ea-4b23-4d14-a1b3-9272c7da49be-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.554630 4585 generic.go:334] "Generic (PLEG): container finished" podID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be"
containerID="87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845" exitCode=0 Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.554715 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbpm8" event={"ID":"54cbb2ea-4b23-4d14-a1b3-9272c7da49be","Type":"ContainerDied","Data":"87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845"} Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.555082 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbpm8" event={"ID":"54cbb2ea-4b23-4d14-a1b3-9272c7da49be","Type":"ContainerDied","Data":"a20c6638dcb69e5ee419242513203255f61f730a56265161738d641320ea543a"} Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.555121 4585 scope.go:117] "RemoveContainer" containerID="87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.554750 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbpm8" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.605457 4585 scope.go:117] "RemoveContainer" containerID="3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.612248 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbpm8"] Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.622397 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbpm8"] Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.633479 4585 scope.go:117] "RemoveContainer" containerID="096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.642939 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-598xh" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="registry-server" probeResult="failure" output=< Feb 15 17:32:19 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:32:19 crc kubenswrapper[4585]: > Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.701500 4585 scope.go:117] "RemoveContainer" containerID="87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845" Feb 15 17:32:19 crc kubenswrapper[4585]: E0215 17:32:19.702092 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845\": container with ID starting with 87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845 not found: ID does not exist" containerID="87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.702192 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845"} err="failed to get container status \"87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845\": rpc error: code = NotFound desc = could not find container \"87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845\": container with ID starting with 87e74000a49f2f86cd035cd0f6f1356f36a27b1a26fafd5ee261ea1dfcc09845 not found: ID does not exist" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.702264 4585 scope.go:117] "RemoveContainer" containerID="3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf" Feb 15 17:32:19 crc kubenswrapper[4585]: E0215 17:32:19.702778 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf\": container with ID starting with 3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf not found: ID does not exist" containerID="3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.702830 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf"} err="failed to get container status \"3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf\": rpc error: code = NotFound desc = could not find container \"3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf\": container with ID starting with 3b1a65fd8bd91dfe587d38ef71a6a54a311ecd33f410b06adceef47a68518acf not found: ID does not exist" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.702863 4585 scope.go:117] "RemoveContainer" containerID="096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4" Feb 15 17:32:19 crc kubenswrapper[4585]: E0215 17:32:19.703168 4585 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4\": container with ID starting with 096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4 not found: ID does not exist" containerID="096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4" Feb 15 17:32:19 crc kubenswrapper[4585]: I0215 17:32:19.703201 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4"} err="failed to get container status \"096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4\": rpc error: code = NotFound desc = could not find container \"096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4\": container with ID starting with 096b84f4f543109660c119efdcf33dfc9a9bce772ffa0bcbe690acf3f8a5cee4 not found: ID does not exist" Feb 15 17:32:20 crc kubenswrapper[4585]: I0215 17:32:20.859564 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" path="/var/lib/kubelet/pods/54cbb2ea-4b23-4d14-a1b3-9272c7da49be/volumes" Feb 15 17:32:26 crc kubenswrapper[4585]: I0215 17:32:26.057940 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-n4nzj"] Feb 15 17:32:26 crc kubenswrapper[4585]: I0215 17:32:26.074834 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-n4nzj"] Feb 15 17:32:26 crc kubenswrapper[4585]: I0215 17:32:26.843066 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:32:26 crc kubenswrapper[4585]: E0215 17:32:26.843578 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:32:26 crc kubenswrapper[4585]: I0215 17:32:26.857324 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208" path="/var/lib/kubelet/pods/0a91a0fe-5b01-46fc-a0e1-b12f2c8ed208/volumes" Feb 15 17:32:29 crc kubenswrapper[4585]: I0215 17:32:29.647389 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-598xh" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="registry-server" probeResult="failure" output=< Feb 15 17:32:29 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:32:29 crc kubenswrapper[4585]: > Feb 15 17:32:38 crc kubenswrapper[4585]: I0215 17:32:38.650451 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-598xh" Feb 15 17:32:38 crc kubenswrapper[4585]: I0215 17:32:38.707959 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-598xh" Feb 15 17:32:38 crc kubenswrapper[4585]: I0215 17:32:38.884080 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-598xh"] Feb 15 17:32:39 crc kubenswrapper[4585]: I0215 17:32:39.836927 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-598xh" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="registry-server" containerID="cri-o://d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661" gracePeriod=2 Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.292820 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-598xh" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.457500 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-catalog-content\") pod \"047b2db9-40e0-48d7-9525-88aef9590848\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.457629 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdsmv\" (UniqueName: \"kubernetes.io/projected/047b2db9-40e0-48d7-9525-88aef9590848-kube-api-access-sdsmv\") pod \"047b2db9-40e0-48d7-9525-88aef9590848\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.457861 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-utilities\") pod \"047b2db9-40e0-48d7-9525-88aef9590848\" (UID: \"047b2db9-40e0-48d7-9525-88aef9590848\") " Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.458872 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-utilities" (OuterVolumeSpecName: "utilities") pod "047b2db9-40e0-48d7-9525-88aef9590848" (UID: "047b2db9-40e0-48d7-9525-88aef9590848"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.466370 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047b2db9-40e0-48d7-9525-88aef9590848-kube-api-access-sdsmv" (OuterVolumeSpecName: "kube-api-access-sdsmv") pod "047b2db9-40e0-48d7-9525-88aef9590848" (UID: "047b2db9-40e0-48d7-9525-88aef9590848"). InnerVolumeSpecName "kube-api-access-sdsmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.560254 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.560286 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdsmv\" (UniqueName: \"kubernetes.io/projected/047b2db9-40e0-48d7-9525-88aef9590848-kube-api-access-sdsmv\") on node \"crc\" DevicePath \"\"" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.592375 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "047b2db9-40e0-48d7-9525-88aef9590848" (UID: "047b2db9-40e0-48d7-9525-88aef9590848"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.661904 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047b2db9-40e0-48d7-9525-88aef9590848-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.842964 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:32:40 crc kubenswrapper[4585]: E0215 17:32:40.843517 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:32:40 
crc kubenswrapper[4585]: I0215 17:32:40.858271 4585 generic.go:334] "Generic (PLEG): container finished" podID="047b2db9-40e0-48d7-9525-88aef9590848" containerID="d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661" exitCode=0 Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.858350 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-598xh" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.863008 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-598xh" event={"ID":"047b2db9-40e0-48d7-9525-88aef9590848","Type":"ContainerDied","Data":"d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661"} Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.863203 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-598xh" event={"ID":"047b2db9-40e0-48d7-9525-88aef9590848","Type":"ContainerDied","Data":"4ab54b2d12894acf1f8b9ad03bcc6d6b98f2b34e797cb9d95b58a53520c044a6"} Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.863337 4585 scope.go:117] "RemoveContainer" containerID="d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.911855 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-598xh"] Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.921551 4585 scope.go:117] "RemoveContainer" containerID="6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27" Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.924523 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-598xh"] Feb 15 17:32:40 crc kubenswrapper[4585]: I0215 17:32:40.987347 4585 scope.go:117] "RemoveContainer" containerID="7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736" Feb 15 17:32:41 crc kubenswrapper[4585]: I0215 
17:32:41.010792 4585 scope.go:117] "RemoveContainer" containerID="d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661" Feb 15 17:32:41 crc kubenswrapper[4585]: E0215 17:32:41.011192 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661\": container with ID starting with d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661 not found: ID does not exist" containerID="d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661" Feb 15 17:32:41 crc kubenswrapper[4585]: I0215 17:32:41.011249 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661"} err="failed to get container status \"d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661\": rpc error: code = NotFound desc = could not find container \"d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661\": container with ID starting with d37b24adfd1caaee71b5f785996b5f2b181963338a01f4404632ba7738f3c661 not found: ID does not exist" Feb 15 17:32:41 crc kubenswrapper[4585]: I0215 17:32:41.011277 4585 scope.go:117] "RemoveContainer" containerID="6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27" Feb 15 17:32:41 crc kubenswrapper[4585]: E0215 17:32:41.011626 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27\": container with ID starting with 6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27 not found: ID does not exist" containerID="6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27" Feb 15 17:32:41 crc kubenswrapper[4585]: I0215 17:32:41.011673 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27"} err="failed to get container status \"6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27\": rpc error: code = NotFound desc = could not find container \"6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27\": container with ID starting with 6fd658c12db932d94969ae769a7b8226f8faf6944c22d7513a96a52c384e0c27 not found: ID does not exist" Feb 15 17:32:41 crc kubenswrapper[4585]: I0215 17:32:41.011700 4585 scope.go:117] "RemoveContainer" containerID="7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736" Feb 15 17:32:41 crc kubenswrapper[4585]: E0215 17:32:41.011967 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736\": container with ID starting with 7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736 not found: ID does not exist" containerID="7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736" Feb 15 17:32:41 crc kubenswrapper[4585]: I0215 17:32:41.011998 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736"} err="failed to get container status \"7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736\": rpc error: code = NotFound desc = could not find container \"7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736\": container with ID starting with 7f802a8e851b6f7d5fde1d7afdc07a2cb12f6a5273cfdf9280f0a32dd7506736 not found: ID does not exist" Feb 15 17:32:42 crc kubenswrapper[4585]: I0215 17:32:42.863216 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047b2db9-40e0-48d7-9525-88aef9590848" path="/var/lib/kubelet/pods/047b2db9-40e0-48d7-9525-88aef9590848/volumes" Feb 15 17:32:43 crc kubenswrapper[4585]: I0215 
17:32:43.055723 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8dngq"] Feb 15 17:32:43 crc kubenswrapper[4585]: I0215 17:32:43.068869 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8dngq"] Feb 15 17:32:43 crc kubenswrapper[4585]: I0215 17:32:43.079789 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ffnrs"] Feb 15 17:32:43 crc kubenswrapper[4585]: I0215 17:32:43.089864 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ffnrs"] Feb 15 17:32:44 crc kubenswrapper[4585]: I0215 17:32:44.862879 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4375904d-94fb-4c2f-804c-5451f7a71c6d" path="/var/lib/kubelet/pods/4375904d-94fb-4c2f-804c-5451f7a71c6d/volumes" Feb 15 17:32:44 crc kubenswrapper[4585]: I0215 17:32:44.865292 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c940d6f6-235b-4817-b022-b5d783c98a5b" path="/var/lib/kubelet/pods/c940d6f6-235b-4817-b022-b5d783c98a5b/volumes" Feb 15 17:32:49 crc kubenswrapper[4585]: I0215 17:32:49.213923 4585 scope.go:117] "RemoveContainer" containerID="029d2b95e658de5b81a2f9dac9fca4cf2aa230894079bc8f0093bfc898202f85" Feb 15 17:32:49 crc kubenswrapper[4585]: I0215 17:32:49.277305 4585 scope.go:117] "RemoveContainer" containerID="5bf1fcaa90b05a0c96ca326dbe9bccfacc8b5baf8af9a1934e391e9e7d721635" Feb 15 17:32:49 crc kubenswrapper[4585]: I0215 17:32:49.344118 4585 scope.go:117] "RemoveContainer" containerID="e1adad7efb0945afc416ec66fecc4e5baf715bfe31534719b77a3ad1a0841147" Feb 15 17:32:49 crc kubenswrapper[4585]: I0215 17:32:49.416042 4585 scope.go:117] "RemoveContainer" containerID="4b7b82c7cabafbb1eb484dbe13a73bfb2d2e3c4c08e9918033f57f5ac7c93be7" Feb 15 17:32:53 crc kubenswrapper[4585]: I0215 17:32:53.842484 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 
17:32:53 crc kubenswrapper[4585]: E0215 17:32:53.843485 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:32:57 crc kubenswrapper[4585]: I0215 17:32:57.068232 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5zw55"] Feb 15 17:32:57 crc kubenswrapper[4585]: I0215 17:32:57.087066 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4vdfm"] Feb 15 17:32:57 crc kubenswrapper[4585]: I0215 17:32:57.106690 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5zw55"] Feb 15 17:32:57 crc kubenswrapper[4585]: I0215 17:32:57.121519 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4vdfm"] Feb 15 17:32:58 crc kubenswrapper[4585]: I0215 17:32:58.855723 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3701151b-dc31-421f-a1e1-9d694e13bc86" path="/var/lib/kubelet/pods/3701151b-dc31-421f-a1e1-9d694e13bc86/volumes" Feb 15 17:32:58 crc kubenswrapper[4585]: I0215 17:32:58.857195 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff" path="/var/lib/kubelet/pods/ad17e3ba-5aaf-4c1a-8490-0e3c1c56aaff/volumes" Feb 15 17:33:05 crc kubenswrapper[4585]: I0215 17:33:05.841849 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:33:05 crc kubenswrapper[4585]: E0215 17:33:05.842831 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:33:18 crc kubenswrapper[4585]: I0215 17:33:18.842959 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:33:18 crc kubenswrapper[4585]: E0215 17:33:18.844845 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:33:33 crc kubenswrapper[4585]: I0215 17:33:33.842326 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:33:33 crc kubenswrapper[4585]: E0215 17:33:33.843441 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:33:45 crc kubenswrapper[4585]: I0215 17:33:45.043029 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-dbmb7"] Feb 15 17:33:45 crc kubenswrapper[4585]: I0215 17:33:45.056224 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-dbmb7"] Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 
17:33:46.050681 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f6lj7"] Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.064773 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-332c-account-create-update-pdlp9"] Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.076808 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-x6mtt"] Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.085617 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f6lj7"] Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.093650 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-332c-account-create-update-pdlp9"] Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.100886 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-x6mtt"] Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.856775 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38468fa5-3373-42c6-88a2-3b405081fd2f" path="/var/lib/kubelet/pods/38468fa5-3373-42c6-88a2-3b405081fd2f/volumes" Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.857501 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="562720ec-ae91-4da4-874f-e61327e5b850" path="/var/lib/kubelet/pods/562720ec-ae91-4da4-874f-e61327e5b850/volumes" Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.858318 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ec1717-0d15-46bd-bfa9-e00997de9192" path="/var/lib/kubelet/pods/b2ec1717-0d15-46bd-bfa9-e00997de9192/volumes" Feb 15 17:33:46 crc kubenswrapper[4585]: I0215 17:33:46.859014 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4035ab3-dd31-461d-a31c-d4c01cecd67e" path="/var/lib/kubelet/pods/e4035ab3-dd31-461d-a31c-d4c01cecd67e/volumes" Feb 15 17:33:47 crc kubenswrapper[4585]: 
I0215 17:33:47.024568 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-854d-account-create-update-hkbvm"] Feb 15 17:33:47 crc kubenswrapper[4585]: I0215 17:33:47.032278 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-854d-account-create-update-hkbvm"] Feb 15 17:33:48 crc kubenswrapper[4585]: I0215 17:33:48.038768 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1712-account-create-update-m7x8g"] Feb 15 17:33:48 crc kubenswrapper[4585]: I0215 17:33:48.050585 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1712-account-create-update-m7x8g"] Feb 15 17:33:48 crc kubenswrapper[4585]: I0215 17:33:48.842097 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:33:48 crc kubenswrapper[4585]: E0215 17:33:48.842897 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:33:48 crc kubenswrapper[4585]: I0215 17:33:48.856263 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f90b2d-7253-4232-b9bc-ab80a39d2a86" path="/var/lib/kubelet/pods/93f90b2d-7253-4232-b9bc-ab80a39d2a86/volumes" Feb 15 17:33:48 crc kubenswrapper[4585]: I0215 17:33:48.858059 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd10677c-322d-4176-a8ac-85e603cd52c8" path="/var/lib/kubelet/pods/bd10677c-322d-4176-a8ac-85e603cd52c8/volumes" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.644055 4585 scope.go:117] "RemoveContainer" 
containerID="4d7b1f6153119c17f82d34badc87046a80b47b8aba51ce492d52ac17fd4e77a0" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.689057 4585 scope.go:117] "RemoveContainer" containerID="6baf9086dabfeb2c227dda15190eb426fb749627268ece14ee5bcfb2c590e49d" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.793383 4585 scope.go:117] "RemoveContainer" containerID="78202bf3429568a1bb1342e11f91a0d256b1b67754b3fd19846c01647926b594" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.825170 4585 scope.go:117] "RemoveContainer" containerID="5e0c0a675f820feae9c964d4feb9b26af060c2f5499275dbd3c24a8680b23a24" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.865981 4585 scope.go:117] "RemoveContainer" containerID="a7ccd1dd192c7184c48b840e9aada9850e6ee7d83d8d08c4edf6f0bd519c579f" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.910909 4585 scope.go:117] "RemoveContainer" containerID="a65d5a4a553c51156a7136c874dde7a15f017f6e6b2e80ef2e0797d2a81b178c" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.956015 4585 scope.go:117] "RemoveContainer" containerID="6e8b18e9e15054974ea5fddce62a4420ba7867ba4d3ad643f19a3d1044507198" Feb 15 17:33:49 crc kubenswrapper[4585]: I0215 17:33:49.981425 4585 scope.go:117] "RemoveContainer" containerID="5ff4d5723ca71a31e08385d92efbd2c524e765f3b6516afd8a057d919d6d70f2" Feb 15 17:34:00 crc kubenswrapper[4585]: I0215 17:34:00.842650 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:34:00 crc kubenswrapper[4585]: E0215 17:34:00.843961 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" 
podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:34:12 crc kubenswrapper[4585]: I0215 17:34:12.842414 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:34:12 crc kubenswrapper[4585]: E0215 17:34:12.843414 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:34:25 crc kubenswrapper[4585]: I0215 17:34:25.842636 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:34:25 crc kubenswrapper[4585]: E0215 17:34:25.843536 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:34:37 crc kubenswrapper[4585]: I0215 17:34:37.058447 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd86j"] Feb 15 17:34:37 crc kubenswrapper[4585]: I0215 17:34:37.073121 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-kd86j"] Feb 15 17:34:38 crc kubenswrapper[4585]: I0215 17:34:38.889485 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3921dd00-14a5-4825-b135-5acc7a95a162" path="/var/lib/kubelet/pods/3921dd00-14a5-4825-b135-5acc7a95a162/volumes" Feb 15 17:34:39 crc 
kubenswrapper[4585]: I0215 17:34:39.844807 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:34:39 crc kubenswrapper[4585]: E0215 17:34:39.845982 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:34:50 crc kubenswrapper[4585]: I0215 17:34:50.200135 4585 scope.go:117] "RemoveContainer" containerID="af093457829ccbf6cf287f4fadffcdeb3aae7d3500b8e397a433515a18e4853f" Feb 15 17:34:53 crc kubenswrapper[4585]: I0215 17:34:53.842904 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:34:53 crc kubenswrapper[4585]: E0215 17:34:53.843475 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:35:00 crc kubenswrapper[4585]: I0215 17:35:00.053663 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rrsg2"] Feb 15 17:35:00 crc kubenswrapper[4585]: I0215 17:35:00.065075 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rrsg2"] Feb 15 17:35:00 crc kubenswrapper[4585]: I0215 17:35:00.858933 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="77018408-f0fc-4655-904c-9090777a235e" path="/var/lib/kubelet/pods/77018408-f0fc-4655-904c-9090777a235e/volumes" Feb 15 17:35:08 crc kubenswrapper[4585]: I0215 17:35:08.841949 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:35:08 crc kubenswrapper[4585]: E0215 17:35:08.842842 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:35:22 crc kubenswrapper[4585]: I0215 17:35:22.842204 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:35:22 crc kubenswrapper[4585]: E0215 17:35:22.843072 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:35:37 crc kubenswrapper[4585]: I0215 17:35:37.841278 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:35:37 crc kubenswrapper[4585]: E0215 17:35:37.842039 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:35:50 crc kubenswrapper[4585]: I0215 17:35:50.294782 4585 scope.go:117] "RemoveContainer" containerID="1713b69a41f76dfdc3517ef459910e24c774cfa58a29f1ef62d7bee5d631cafc" Feb 15 17:35:52 crc kubenswrapper[4585]: I0215 17:35:52.842767 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:35:53 crc kubenswrapper[4585]: I0215 17:35:53.277816 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"9179625596c50bb82a01dd7171573c41984b6c211e5abac0a24996fe5bae734e"} Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.231539 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s466s"] Feb 15 17:36:23 crc kubenswrapper[4585]: E0215 17:36:23.237341 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="extract-content" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.237368 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="extract-content" Feb 15 17:36:23 crc kubenswrapper[4585]: E0215 17:36:23.237421 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="extract-utilities" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.237436 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="extract-utilities" Feb 15 17:36:23 crc kubenswrapper[4585]: E0215 17:36:23.237460 4585 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="extract-content" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.237473 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="extract-content" Feb 15 17:36:23 crc kubenswrapper[4585]: E0215 17:36:23.237504 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="registry-server" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.237513 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="registry-server" Feb 15 17:36:23 crc kubenswrapper[4585]: E0215 17:36:23.237553 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="registry-server" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.237561 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="registry-server" Feb 15 17:36:23 crc kubenswrapper[4585]: E0215 17:36:23.237578 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="extract-utilities" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.237585 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="extract-utilities" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.237851 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="54cbb2ea-4b23-4d14-a1b3-9272c7da49be" containerName="registry-server" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.238004 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="047b2db9-40e0-48d7-9525-88aef9590848" containerName="registry-server" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.251748 4585 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.284352 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s466s"] Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.424534 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-catalog-content\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.424696 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-utilities\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.424732 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wjr\" (UniqueName: \"kubernetes.io/projected/5dc6c047-f800-4bbf-b0df-c379cc1188d1-kube-api-access-26wjr\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.526733 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-utilities\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.526789 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-26wjr\" (UniqueName: \"kubernetes.io/projected/5dc6c047-f800-4bbf-b0df-c379cc1188d1-kube-api-access-26wjr\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.526840 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-catalog-content\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.527211 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-utilities\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.527243 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-catalog-content\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.554943 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wjr\" (UniqueName: \"kubernetes.io/projected/5dc6c047-f800-4bbf-b0df-c379cc1188d1-kube-api-access-26wjr\") pod \"certified-operators-s466s\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:23 crc kubenswrapper[4585]: I0215 17:36:23.584737 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:24 crc kubenswrapper[4585]: I0215 17:36:24.112642 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s466s"] Feb 15 17:36:24 crc kubenswrapper[4585]: I0215 17:36:24.655416 4585 generic.go:334] "Generic (PLEG): container finished" podID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerID="ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c" exitCode=0 Feb 15 17:36:24 crc kubenswrapper[4585]: I0215 17:36:24.655475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s466s" event={"ID":"5dc6c047-f800-4bbf-b0df-c379cc1188d1","Type":"ContainerDied","Data":"ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c"} Feb 15 17:36:24 crc kubenswrapper[4585]: I0215 17:36:24.655516 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s466s" event={"ID":"5dc6c047-f800-4bbf-b0df-c379cc1188d1","Type":"ContainerStarted","Data":"888fb94dc4882832edae9881fc955d810e2cd11787b6e00ab01dc95ae36f1039"} Feb 15 17:36:25 crc kubenswrapper[4585]: I0215 17:36:25.666281 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s466s" event={"ID":"5dc6c047-f800-4bbf-b0df-c379cc1188d1","Type":"ContainerStarted","Data":"728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461"} Feb 15 17:36:27 crc kubenswrapper[4585]: I0215 17:36:27.692942 4585 generic.go:334] "Generic (PLEG): container finished" podID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerID="728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461" exitCode=0 Feb 15 17:36:27 crc kubenswrapper[4585]: I0215 17:36:27.693180 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s466s" 
event={"ID":"5dc6c047-f800-4bbf-b0df-c379cc1188d1","Type":"ContainerDied","Data":"728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461"} Feb 15 17:36:28 crc kubenswrapper[4585]: I0215 17:36:28.712714 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s466s" event={"ID":"5dc6c047-f800-4bbf-b0df-c379cc1188d1","Type":"ContainerStarted","Data":"71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183"} Feb 15 17:36:28 crc kubenswrapper[4585]: I0215 17:36:28.743143 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s466s" podStartSLOduration=2.328263056 podStartE2EDuration="5.7431272s" podCreationTimestamp="2026-02-15 17:36:23 +0000 UTC" firstStartedPulling="2026-02-15 17:36:24.660267768 +0000 UTC m=+1840.603675890" lastFinishedPulling="2026-02-15 17:36:28.075131892 +0000 UTC m=+1844.018540034" observedRunningTime="2026-02-15 17:36:28.738516195 +0000 UTC m=+1844.681924367" watchObservedRunningTime="2026-02-15 17:36:28.7431272 +0000 UTC m=+1844.686535332" Feb 15 17:36:33 crc kubenswrapper[4585]: I0215 17:36:33.585175 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:33 crc kubenswrapper[4585]: I0215 17:36:33.585719 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:34 crc kubenswrapper[4585]: I0215 17:36:34.654409 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s466s" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="registry-server" probeResult="failure" output=< Feb 15 17:36:34 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:36:34 crc kubenswrapper[4585]: > Feb 15 17:36:43 crc kubenswrapper[4585]: I0215 17:36:43.671099 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:43 crc kubenswrapper[4585]: I0215 17:36:43.748029 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:43 crc kubenswrapper[4585]: I0215 17:36:43.910546 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s466s"] Feb 15 17:36:44 crc kubenswrapper[4585]: I0215 17:36:44.910012 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s466s" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="registry-server" containerID="cri-o://71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183" gracePeriod=2 Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.355343 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.549570 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-catalog-content\") pod \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.549668 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26wjr\" (UniqueName: \"kubernetes.io/projected/5dc6c047-f800-4bbf-b0df-c379cc1188d1-kube-api-access-26wjr\") pod \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.549816 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-utilities\") pod \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\" (UID: \"5dc6c047-f800-4bbf-b0df-c379cc1188d1\") " Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.550861 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-utilities" (OuterVolumeSpecName: "utilities") pod "5dc6c047-f800-4bbf-b0df-c379cc1188d1" (UID: "5dc6c047-f800-4bbf-b0df-c379cc1188d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.560920 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc6c047-f800-4bbf-b0df-c379cc1188d1-kube-api-access-26wjr" (OuterVolumeSpecName: "kube-api-access-26wjr") pod "5dc6c047-f800-4bbf-b0df-c379cc1188d1" (UID: "5dc6c047-f800-4bbf-b0df-c379cc1188d1"). InnerVolumeSpecName "kube-api-access-26wjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.598503 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dc6c047-f800-4bbf-b0df-c379cc1188d1" (UID: "5dc6c047-f800-4bbf-b0df-c379cc1188d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.652037 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.652070 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26wjr\" (UniqueName: \"kubernetes.io/projected/5dc6c047-f800-4bbf-b0df-c379cc1188d1-kube-api-access-26wjr\") on node \"crc\" DevicePath \"\"" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.652082 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dc6c047-f800-4bbf-b0df-c379cc1188d1-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.924651 4585 generic.go:334] "Generic (PLEG): container finished" podID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerID="71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183" exitCode=0 Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.924723 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s466s" event={"ID":"5dc6c047-f800-4bbf-b0df-c379cc1188d1","Type":"ContainerDied","Data":"71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183"} Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.924703 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s466s" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.924790 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s466s" event={"ID":"5dc6c047-f800-4bbf-b0df-c379cc1188d1","Type":"ContainerDied","Data":"888fb94dc4882832edae9881fc955d810e2cd11787b6e00ab01dc95ae36f1039"} Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.924851 4585 scope.go:117] "RemoveContainer" containerID="71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.964110 4585 scope.go:117] "RemoveContainer" containerID="728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461" Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.981013 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s466s"] Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.993084 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s466s"] Feb 15 17:36:45 crc kubenswrapper[4585]: I0215 17:36:45.996165 4585 scope.go:117] "RemoveContainer" containerID="ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c" Feb 15 17:36:46 crc kubenswrapper[4585]: I0215 17:36:46.080002 4585 scope.go:117] "RemoveContainer" containerID="71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183" Feb 15 17:36:46 crc kubenswrapper[4585]: E0215 17:36:46.080486 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183\": container with ID starting with 71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183 not found: ID does not exist" containerID="71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183" Feb 15 17:36:46 crc kubenswrapper[4585]: I0215 17:36:46.080530 4585 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183"} err="failed to get container status \"71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183\": rpc error: code = NotFound desc = could not find container \"71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183\": container with ID starting with 71fcc701f093cfb1fe94dc1c7e933546db6b273f97e763798f2e9bd78c1e1183 not found: ID does not exist" Feb 15 17:36:46 crc kubenswrapper[4585]: I0215 17:36:46.080556 4585 scope.go:117] "RemoveContainer" containerID="728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461" Feb 15 17:36:46 crc kubenswrapper[4585]: E0215 17:36:46.081417 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461\": container with ID starting with 728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461 not found: ID does not exist" containerID="728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461" Feb 15 17:36:46 crc kubenswrapper[4585]: I0215 17:36:46.081486 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461"} err="failed to get container status \"728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461\": rpc error: code = NotFound desc = could not find container \"728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461\": container with ID starting with 728113fab7d8ba09cf4463603e960211e499c0665e9f5acf438fd83d4ab7e461 not found: ID does not exist" Feb 15 17:36:46 crc kubenswrapper[4585]: I0215 17:36:46.081532 4585 scope.go:117] "RemoveContainer" containerID="ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c" Feb 15 17:36:46 crc kubenswrapper[4585]: E0215 
17:36:46.081997 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c\": container with ID starting with ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c not found: ID does not exist" containerID="ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c" Feb 15 17:36:46 crc kubenswrapper[4585]: I0215 17:36:46.082023 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c"} err="failed to get container status \"ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c\": rpc error: code = NotFound desc = could not find container \"ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c\": container with ID starting with ec4531177094c44d3dc306e57f4a092e235edf82409893a64d9269635de2416c not found: ID does not exist" Feb 15 17:36:46 crc kubenswrapper[4585]: I0215 17:36:46.882467 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" path="/var/lib/kubelet/pods/5dc6c047-f800-4bbf-b0df-c379cc1188d1/volumes" Feb 15 17:38:17 crc kubenswrapper[4585]: I0215 17:38:17.013721 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:38:17 crc kubenswrapper[4585]: I0215 17:38:17.014196 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 15 17:38:47 crc kubenswrapper[4585]: I0215 17:38:47.014176 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:38:47 crc kubenswrapper[4585]: I0215 17:38:47.014684 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.014732 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.015357 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.015408 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.016249 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9179625596c50bb82a01dd7171573c41984b6c211e5abac0a24996fe5bae734e"} 
pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.016316 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://9179625596c50bb82a01dd7171573c41984b6c211e5abac0a24996fe5bae734e" gracePeriod=600 Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.885499 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="9179625596c50bb82a01dd7171573c41984b6c211e5abac0a24996fe5bae734e" exitCode=0 Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.885620 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"9179625596c50bb82a01dd7171573c41984b6c211e5abac0a24996fe5bae734e"} Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.885961 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"} Feb 15 17:39:17 crc kubenswrapper[4585]: I0215 17:39:17.885996 4585 scope.go:117] "RemoveContainer" containerID="8dc719e77c4155783a8791bb15c039afd74cb15ba4a4a09e47c126b86218159d" Feb 15 17:41:17 crc kubenswrapper[4585]: I0215 17:41:17.014493 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 15 17:41:17 crc kubenswrapper[4585]: I0215 17:41:17.015170 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:41:47 crc kubenswrapper[4585]: I0215 17:41:47.014682 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:41:47 crc kubenswrapper[4585]: I0215 17:41:47.015087 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.242111 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mvnpj"] Feb 15 17:41:49 crc kubenswrapper[4585]: E0215 17:41:49.244869 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="registry-server" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.245078 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="registry-server" Feb 15 17:41:49 crc kubenswrapper[4585]: E0215 17:41:49.245245 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="extract-content" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.245371 4585 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="extract-content" Feb 15 17:41:49 crc kubenswrapper[4585]: E0215 17:41:49.245509 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="extract-utilities" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.245659 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="extract-utilities" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.246203 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc6c047-f800-4bbf-b0df-c379cc1188d1" containerName="registry-server" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.249369 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.256142 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvnpj"] Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.315910 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxjz8\" (UniqueName: \"kubernetes.io/projected/a7c7880b-3e09-4295-8d8b-6145fa7e887f-kube-api-access-cxjz8\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.316182 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-utilities\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.316333 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-catalog-content\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.418195 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjz8\" (UniqueName: \"kubernetes.io/projected/a7c7880b-3e09-4295-8d8b-6145fa7e887f-kube-api-access-cxjz8\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.418251 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-utilities\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.418765 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-utilities\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.418806 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-catalog-content\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.418523 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-catalog-content\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.471389 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxjz8\" (UniqueName: \"kubernetes.io/projected/a7c7880b-3e09-4295-8d8b-6145fa7e887f-kube-api-access-cxjz8\") pod \"community-operators-mvnpj\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:49 crc kubenswrapper[4585]: I0215 17:41:49.566081 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:50 crc kubenswrapper[4585]: I0215 17:41:50.309719 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvnpj"] Feb 15 17:41:50 crc kubenswrapper[4585]: I0215 17:41:50.693870 4585 generic.go:334] "Generic (PLEG): container finished" podID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerID="4fed199d6d4e939140c081afb4edf4a68eaf66d86471e8496d768278b48eb8ae" exitCode=0 Feb 15 17:41:50 crc kubenswrapper[4585]: I0215 17:41:50.693973 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvnpj" event={"ID":"a7c7880b-3e09-4295-8d8b-6145fa7e887f","Type":"ContainerDied","Data":"4fed199d6d4e939140c081afb4edf4a68eaf66d86471e8496d768278b48eb8ae"} Feb 15 17:41:50 crc kubenswrapper[4585]: I0215 17:41:50.694170 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvnpj" event={"ID":"a7c7880b-3e09-4295-8d8b-6145fa7e887f","Type":"ContainerStarted","Data":"f8d9763479d43aa94a4cd4a0c21f000e9d0eda8869c2fdee0d24d2d06c059cb0"} 
Feb 15 17:41:50 crc kubenswrapper[4585]: I0215 17:41:50.695734 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:41:51 crc kubenswrapper[4585]: I0215 17:41:51.706071 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvnpj" event={"ID":"a7c7880b-3e09-4295-8d8b-6145fa7e887f","Type":"ContainerStarted","Data":"42d08609bc8304b291f645c32f33aadb68f7569067743c29a266c0758be07b7e"} Feb 15 17:41:53 crc kubenswrapper[4585]: I0215 17:41:53.738573 4585 generic.go:334] "Generic (PLEG): container finished" podID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerID="42d08609bc8304b291f645c32f33aadb68f7569067743c29a266c0758be07b7e" exitCode=0 Feb 15 17:41:53 crc kubenswrapper[4585]: I0215 17:41:53.738628 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvnpj" event={"ID":"a7c7880b-3e09-4295-8d8b-6145fa7e887f","Type":"ContainerDied","Data":"42d08609bc8304b291f645c32f33aadb68f7569067743c29a266c0758be07b7e"} Feb 15 17:41:54 crc kubenswrapper[4585]: I0215 17:41:54.751211 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvnpj" event={"ID":"a7c7880b-3e09-4295-8d8b-6145fa7e887f","Type":"ContainerStarted","Data":"175d318637defd4ab54ba89630b14c9c65517f656a445d1906a435ab27c8d3de"} Feb 15 17:41:54 crc kubenswrapper[4585]: I0215 17:41:54.779432 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mvnpj" podStartSLOduration=2.192795236 podStartE2EDuration="5.779412964s" podCreationTimestamp="2026-02-15 17:41:49 +0000 UTC" firstStartedPulling="2026-02-15 17:41:50.695507802 +0000 UTC m=+2166.638915934" lastFinishedPulling="2026-02-15 17:41:54.2821255 +0000 UTC m=+2170.225533662" observedRunningTime="2026-02-15 17:41:54.774886753 +0000 UTC m=+2170.718294885" watchObservedRunningTime="2026-02-15 17:41:54.779412964 +0000 
UTC m=+2170.722821106" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.566917 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.567566 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.615377 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qmzj7"] Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.617459 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.642092 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmzj7"] Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.699047 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.803869 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-utilities\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.803927 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w84qg\" (UniqueName: \"kubernetes.io/projected/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-kube-api-access-w84qg\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 
17:41:59.804014 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-catalog-content\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.865741 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.906502 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-catalog-content\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.906648 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-utilities\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.906693 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w84qg\" (UniqueName: \"kubernetes.io/projected/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-kube-api-access-w84qg\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.907058 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-utilities\") pod \"redhat-operators-qmzj7\" 
(UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.907204 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-catalog-content\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.928767 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w84qg\" (UniqueName: \"kubernetes.io/projected/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-kube-api-access-w84qg\") pod \"redhat-operators-qmzj7\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") " pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:41:59 crc kubenswrapper[4585]: I0215 17:41:59.933276 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:42:00 crc kubenswrapper[4585]: I0215 17:42:00.391352 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmzj7"] Feb 15 17:42:00 crc kubenswrapper[4585]: I0215 17:42:00.818524 4585 generic.go:334] "Generic (PLEG): container finished" podID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerID="080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f" exitCode=0 Feb 15 17:42:00 crc kubenswrapper[4585]: I0215 17:42:00.818899 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmzj7" event={"ID":"c7a5b254-dcc2-492a-be46-d40c8ad5eca1","Type":"ContainerDied","Data":"080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f"} Feb 15 17:42:00 crc kubenswrapper[4585]: I0215 17:42:00.818923 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmzj7" 
event={"ID":"c7a5b254-dcc2-492a-be46-d40c8ad5eca1","Type":"ContainerStarted","Data":"05e3da3fe37e8730e422cff3c1cc66954083eff64ecc9530d9f9aca4a6846bde"} Feb 15 17:42:01 crc kubenswrapper[4585]: I0215 17:42:01.852538 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmzj7" event={"ID":"c7a5b254-dcc2-492a-be46-d40c8ad5eca1","Type":"ContainerStarted","Data":"16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b"} Feb 15 17:42:02 crc kubenswrapper[4585]: I0215 17:42:02.604586 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvnpj"] Feb 15 17:42:02 crc kubenswrapper[4585]: I0215 17:42:02.605088 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mvnpj" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="registry-server" containerID="cri-o://175d318637defd4ab54ba89630b14c9c65517f656a445d1906a435ab27c8d3de" gracePeriod=2 Feb 15 17:42:02 crc kubenswrapper[4585]: I0215 17:42:02.862981 4585 generic.go:334] "Generic (PLEG): container finished" podID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerID="175d318637defd4ab54ba89630b14c9c65517f656a445d1906a435ab27c8d3de" exitCode=0 Feb 15 17:42:02 crc kubenswrapper[4585]: I0215 17:42:02.863815 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvnpj" event={"ID":"a7c7880b-3e09-4295-8d8b-6145fa7e887f","Type":"ContainerDied","Data":"175d318637defd4ab54ba89630b14c9c65517f656a445d1906a435ab27c8d3de"} Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.096297 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.281234 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxjz8\" (UniqueName: \"kubernetes.io/projected/a7c7880b-3e09-4295-8d8b-6145fa7e887f-kube-api-access-cxjz8\") pod \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.281418 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-utilities\") pod \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.281445 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-catalog-content\") pod \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\" (UID: \"a7c7880b-3e09-4295-8d8b-6145fa7e887f\") " Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.282468 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-utilities" (OuterVolumeSpecName: "utilities") pod "a7c7880b-3e09-4295-8d8b-6145fa7e887f" (UID: "a7c7880b-3e09-4295-8d8b-6145fa7e887f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.291861 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c7880b-3e09-4295-8d8b-6145fa7e887f-kube-api-access-cxjz8" (OuterVolumeSpecName: "kube-api-access-cxjz8") pod "a7c7880b-3e09-4295-8d8b-6145fa7e887f" (UID: "a7c7880b-3e09-4295-8d8b-6145fa7e887f"). InnerVolumeSpecName "kube-api-access-cxjz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.327662 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7c7880b-3e09-4295-8d8b-6145fa7e887f" (UID: "a7c7880b-3e09-4295-8d8b-6145fa7e887f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.383833 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxjz8\" (UniqueName: \"kubernetes.io/projected/a7c7880b-3e09-4295-8d8b-6145fa7e887f-kube-api-access-cxjz8\") on node \"crc\" DevicePath \"\"" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.383865 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.383873 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7c7880b-3e09-4295-8d8b-6145fa7e887f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.875689 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvnpj" event={"ID":"a7c7880b-3e09-4295-8d8b-6145fa7e887f","Type":"ContainerDied","Data":"f8d9763479d43aa94a4cd4a0c21f000e9d0eda8869c2fdee0d24d2d06c059cb0"} Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.875975 4585 scope.go:117] "RemoveContainer" containerID="175d318637defd4ab54ba89630b14c9c65517f656a445d1906a435ab27c8d3de" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.876118 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvnpj" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.915731 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvnpj"] Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.918272 4585 scope.go:117] "RemoveContainer" containerID="42d08609bc8304b291f645c32f33aadb68f7569067743c29a266c0758be07b7e" Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.927657 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mvnpj"] Feb 15 17:42:03 crc kubenswrapper[4585]: I0215 17:42:03.940423 4585 scope.go:117] "RemoveContainer" containerID="4fed199d6d4e939140c081afb4edf4a68eaf66d86471e8496d768278b48eb8ae" Feb 15 17:42:04 crc kubenswrapper[4585]: I0215 17:42:04.864982 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" path="/var/lib/kubelet/pods/a7c7880b-3e09-4295-8d8b-6145fa7e887f/volumes" Feb 15 17:42:06 crc kubenswrapper[4585]: I0215 17:42:06.919570 4585 generic.go:334] "Generic (PLEG): container finished" podID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerID="16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b" exitCode=0 Feb 15 17:42:06 crc kubenswrapper[4585]: I0215 17:42:06.919691 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmzj7" event={"ID":"c7a5b254-dcc2-492a-be46-d40c8ad5eca1","Type":"ContainerDied","Data":"16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b"} Feb 15 17:42:07 crc kubenswrapper[4585]: I0215 17:42:07.930233 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmzj7" event={"ID":"c7a5b254-dcc2-492a-be46-d40c8ad5eca1","Type":"ContainerStarted","Data":"acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8"} Feb 15 17:42:07 crc kubenswrapper[4585]: I0215 17:42:07.959355 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qmzj7" podStartSLOduration=2.500018491 podStartE2EDuration="8.959338042s" podCreationTimestamp="2026-02-15 17:41:59 +0000 UTC" firstStartedPulling="2026-02-15 17:42:00.820398935 +0000 UTC m=+2176.763807067" lastFinishedPulling="2026-02-15 17:42:07.279718476 +0000 UTC m=+2183.223126618" observedRunningTime="2026-02-15 17:42:07.953690201 +0000 UTC m=+2183.897098353" watchObservedRunningTime="2026-02-15 17:42:07.959338042 +0000 UTC m=+2183.902746184" Feb 15 17:42:09 crc kubenswrapper[4585]: I0215 17:42:09.934379 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:42:09 crc kubenswrapper[4585]: I0215 17:42:09.934904 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qmzj7" Feb 15 17:42:10 crc kubenswrapper[4585]: I0215 17:42:10.995675 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qmzj7" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="registry-server" probeResult="failure" output=< Feb 15 17:42:10 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:42:10 crc kubenswrapper[4585]: > Feb 15 17:42:17 crc kubenswrapper[4585]: I0215 17:42:17.013936 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:42:17 crc kubenswrapper[4585]: I0215 17:42:17.014473 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:42:17 crc kubenswrapper[4585]: I0215 17:42:17.014522 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:42:17 crc kubenswrapper[4585]: I0215 17:42:17.015310 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:42:17 crc kubenswrapper[4585]: I0215 17:42:17.015351 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" gracePeriod=600 Feb 15 17:42:17 crc kubenswrapper[4585]: E0215 17:42:17.162066 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:42:18 crc kubenswrapper[4585]: I0215 17:42:18.061299 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" exitCode=0 Feb 15 17:42:18 crc kubenswrapper[4585]: I0215 17:42:18.061449 4585 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"}
Feb 15 17:42:18 crc kubenswrapper[4585]: I0215 17:42:18.061507 4585 scope.go:117] "RemoveContainer" containerID="9179625596c50bb82a01dd7171573c41984b6c211e5abac0a24996fe5bae734e"
Feb 15 17:42:18 crc kubenswrapper[4585]: I0215 17:42:18.063238 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:42:18 crc kubenswrapper[4585]: E0215 17:42:18.066489 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:42:20 crc kubenswrapper[4585]: I0215 17:42:20.988454 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qmzj7" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="registry-server" probeResult="failure" output=<
Feb 15 17:42:20 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s
Feb 15 17:42:20 crc kubenswrapper[4585]: >
Feb 15 17:42:29 crc kubenswrapper[4585]: I0215 17:42:29.996422 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qmzj7"
Feb 15 17:42:30 crc kubenswrapper[4585]: I0215 17:42:30.062249 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qmzj7"
Feb 15 17:42:30 crc kubenswrapper[4585]: I0215 17:42:30.841026 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmzj7"]
Feb 15 17:42:30 crc kubenswrapper[4585]: I0215 17:42:30.842693 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:42:30 crc kubenswrapper[4585]: E0215 17:42:30.843413 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.206702 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qmzj7" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="registry-server" containerID="cri-o://acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8" gracePeriod=2
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.658775 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmzj7"
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.684457 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w84qg\" (UniqueName: \"kubernetes.io/projected/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-kube-api-access-w84qg\") pod \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") "
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.702843 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-kube-api-access-w84qg" (OuterVolumeSpecName: "kube-api-access-w84qg") pod "c7a5b254-dcc2-492a-be46-d40c8ad5eca1" (UID: "c7a5b254-dcc2-492a-be46-d40c8ad5eca1"). InnerVolumeSpecName "kube-api-access-w84qg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.786812 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-utilities\") pod \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") "
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.787008 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-catalog-content\") pod \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\" (UID: \"c7a5b254-dcc2-492a-be46-d40c8ad5eca1\") "
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.787581 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-utilities" (OuterVolumeSpecName: "utilities") pod "c7a5b254-dcc2-492a-be46-d40c8ad5eca1" (UID: "c7a5b254-dcc2-492a-be46-d40c8ad5eca1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.787786 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w84qg\" (UniqueName: \"kubernetes.io/projected/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-kube-api-access-w84qg\") on node \"crc\" DevicePath \"\""
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.787810 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-utilities\") on node \"crc\" DevicePath \"\""
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.899311 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7a5b254-dcc2-492a-be46-d40c8ad5eca1" (UID: "c7a5b254-dcc2-492a-be46-d40c8ad5eca1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:42:31 crc kubenswrapper[4585]: I0215 17:42:31.991901 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7a5b254-dcc2-492a-be46-d40c8ad5eca1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.219476 4585 generic.go:334] "Generic (PLEG): container finished" podID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerID="acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8" exitCode=0
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.219544 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmzj7" event={"ID":"c7a5b254-dcc2-492a-be46-d40c8ad5eca1","Type":"ContainerDied","Data":"acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8"}
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.219648 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmzj7" event={"ID":"c7a5b254-dcc2-492a-be46-d40c8ad5eca1","Type":"ContainerDied","Data":"05e3da3fe37e8730e422cff3c1cc66954083eff64ecc9530d9f9aca4a6846bde"}
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.219675 4585 scope.go:117] "RemoveContainer" containerID="acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.221051 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmzj7"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.241040 4585 scope.go:117] "RemoveContainer" containerID="16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.289317 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qmzj7"]
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.292704 4585 scope.go:117] "RemoveContainer" containerID="080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.300137 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qmzj7"]
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.319183 4585 scope.go:117] "RemoveContainer" containerID="acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8"
Feb 15 17:42:32 crc kubenswrapper[4585]: E0215 17:42:32.319764 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8\": container with ID starting with acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8 not found: ID does not exist" containerID="acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.319886 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8"} err="failed to get container status \"acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8\": rpc error: code = NotFound desc = could not find container \"acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8\": container with ID starting with acfadb4ca832d621eaac753972fdbe54633c3a620038c56b0ffad66c457ec8d8 not found: ID does not exist"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.319981 4585 scope.go:117] "RemoveContainer" containerID="16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b"
Feb 15 17:42:32 crc kubenswrapper[4585]: E0215 17:42:32.321419 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b\": container with ID starting with 16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b not found: ID does not exist" containerID="16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.321456 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b"} err="failed to get container status \"16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b\": rpc error: code = NotFound desc = could not find container \"16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b\": container with ID starting with 16187dab2868602ff6699a91f69a3ba098bea9d8aafdda55f17d0d67aae4fb0b not found: ID does not exist"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.321478 4585 scope.go:117] "RemoveContainer" containerID="080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f"
Feb 15 17:42:32 crc kubenswrapper[4585]: E0215 17:42:32.322143 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f\": container with ID starting with 080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f not found: ID does not exist" containerID="080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.322201 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f"} err="failed to get container status \"080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f\": rpc error: code = NotFound desc = could not find container \"080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f\": container with ID starting with 080def6e4f1cdc5af2bcc208abe1a287ac6fdc4c0d298bca46dd33f54960804f not found: ID does not exist"
Feb 15 17:42:32 crc kubenswrapper[4585]: I0215 17:42:32.852850 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" path="/var/lib/kubelet/pods/c7a5b254-dcc2-492a-be46-d40c8ad5eca1/volumes"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485008 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5r6df"]
Feb 15 17:42:34 crc kubenswrapper[4585]: E0215 17:42:34.485439 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="registry-server"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485450 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="registry-server"
Feb 15 17:42:34 crc kubenswrapper[4585]: E0215 17:42:34.485461 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="registry-server"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485469 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="registry-server"
Feb 15 17:42:34 crc kubenswrapper[4585]: E0215 17:42:34.485487 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="extract-utilities"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485492 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="extract-utilities"
Feb 15 17:42:34 crc kubenswrapper[4585]: E0215 17:42:34.485510 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="extract-content"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485516 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="extract-content"
Feb 15 17:42:34 crc kubenswrapper[4585]: E0215 17:42:34.485528 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="extract-utilities"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485534 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="extract-utilities"
Feb 15 17:42:34 crc kubenswrapper[4585]: E0215 17:42:34.485548 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="extract-content"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485554 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="extract-content"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485764 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7a5b254-dcc2-492a-be46-d40c8ad5eca1" containerName="registry-server"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.485780 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c7880b-3e09-4295-8d8b-6145fa7e887f" containerName="registry-server"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.487327 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5r6df"]
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.487400 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.647835 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grd6b\" (UniqueName: \"kubernetes.io/projected/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-kube-api-access-grd6b\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.648166 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-utilities\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.648218 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-catalog-content\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.750223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-utilities\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.750304 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-catalog-content\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.750359 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grd6b\" (UniqueName: \"kubernetes.io/projected/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-kube-api-access-grd6b\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.750865 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-utilities\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.750869 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-catalog-content\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.772709 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grd6b\" (UniqueName: \"kubernetes.io/projected/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-kube-api-access-grd6b\") pod \"redhat-marketplace-5r6df\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") " pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:34 crc kubenswrapper[4585]: I0215 17:42:34.823319 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:35 crc kubenswrapper[4585]: I0215 17:42:35.312235 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5r6df"]
Feb 15 17:42:36 crc kubenswrapper[4585]: I0215 17:42:36.274504 4585 generic.go:334] "Generic (PLEG): container finished" podID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerID="89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d" exitCode=0
Feb 15 17:42:36 crc kubenswrapper[4585]: I0215 17:42:36.274814 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5r6df" event={"ID":"f569857c-ad8b-4a0f-ad11-6846ae2d61e3","Type":"ContainerDied","Data":"89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d"}
Feb 15 17:42:36 crc kubenswrapper[4585]: I0215 17:42:36.274890 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5r6df" event={"ID":"f569857c-ad8b-4a0f-ad11-6846ae2d61e3","Type":"ContainerStarted","Data":"cd5c8e7d1ff32a7aa54f72d9c8fab9c4342db0ff710c01ef286edef9ad5d0af7"}
Feb 15 17:42:37 crc kubenswrapper[4585]: I0215 17:42:37.290834 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5r6df" event={"ID":"f569857c-ad8b-4a0f-ad11-6846ae2d61e3","Type":"ContainerStarted","Data":"24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52"}
Feb 15 17:42:38 crc kubenswrapper[4585]: I0215 17:42:38.303283 4585 generic.go:334] "Generic (PLEG): container finished" podID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerID="24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52" exitCode=0
Feb 15 17:42:38 crc kubenswrapper[4585]: I0215 17:42:38.303634 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5r6df" event={"ID":"f569857c-ad8b-4a0f-ad11-6846ae2d61e3","Type":"ContainerDied","Data":"24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52"}
Feb 15 17:42:39 crc kubenswrapper[4585]: I0215 17:42:39.317890 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5r6df" event={"ID":"f569857c-ad8b-4a0f-ad11-6846ae2d61e3","Type":"ContainerStarted","Data":"20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a"}
Feb 15 17:42:39 crc kubenswrapper[4585]: I0215 17:42:39.346551 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5r6df" podStartSLOduration=2.915001977 podStartE2EDuration="5.346534309s" podCreationTimestamp="2026-02-15 17:42:34 +0000 UTC" firstStartedPulling="2026-02-15 17:42:36.276712148 +0000 UTC m=+2212.220120290" lastFinishedPulling="2026-02-15 17:42:38.70824446 +0000 UTC m=+2214.651652622" observedRunningTime="2026-02-15 17:42:39.343279412 +0000 UTC m=+2215.286687544" watchObservedRunningTime="2026-02-15 17:42:39.346534309 +0000 UTC m=+2215.289942441"
Feb 15 17:42:44 crc kubenswrapper[4585]: I0215 17:42:44.824154 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:44 crc kubenswrapper[4585]: I0215 17:42:44.824988 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:44 crc kubenswrapper[4585]: I0215 17:42:44.852478 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:42:44 crc kubenswrapper[4585]: E0215 17:42:44.853083 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:42:44 crc kubenswrapper[4585]: I0215 17:42:44.888448 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:45 crc kubenswrapper[4585]: I0215 17:42:45.461737 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:45 crc kubenswrapper[4585]: I0215 17:42:45.512183 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5r6df"]
Feb 15 17:42:47 crc kubenswrapper[4585]: I0215 17:42:47.415348 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5r6df" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="registry-server" containerID="cri-o://20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a" gracePeriod=2
Feb 15 17:42:47 crc kubenswrapper[4585]: I0215 17:42:47.926163 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.042936 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-catalog-content\") pod \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") "
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.043281 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-utilities\") pod \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") "
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.043439 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grd6b\" (UniqueName: \"kubernetes.io/projected/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-kube-api-access-grd6b\") pod \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\" (UID: \"f569857c-ad8b-4a0f-ad11-6846ae2d61e3\") "
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.044292 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-utilities" (OuterVolumeSpecName: "utilities") pod "f569857c-ad8b-4a0f-ad11-6846ae2d61e3" (UID: "f569857c-ad8b-4a0f-ad11-6846ae2d61e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.052106 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-kube-api-access-grd6b" (OuterVolumeSpecName: "kube-api-access-grd6b") pod "f569857c-ad8b-4a0f-ad11-6846ae2d61e3" (UID: "f569857c-ad8b-4a0f-ad11-6846ae2d61e3"). InnerVolumeSpecName "kube-api-access-grd6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.072171 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f569857c-ad8b-4a0f-ad11-6846ae2d61e3" (UID: "f569857c-ad8b-4a0f-ad11-6846ae2d61e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.146568 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grd6b\" (UniqueName: \"kubernetes.io/projected/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-kube-api-access-grd6b\") on node \"crc\" DevicePath \"\""
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.146815 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.146898 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f569857c-ad8b-4a0f-ad11-6846ae2d61e3-utilities\") on node \"crc\" DevicePath \"\""
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.426693 4585 generic.go:334] "Generic (PLEG): container finished" podID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerID="20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a" exitCode=0
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.426757 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5r6df" event={"ID":"f569857c-ad8b-4a0f-ad11-6846ae2d61e3","Type":"ContainerDied","Data":"20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a"}
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.426793 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5r6df"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.426807 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5r6df" event={"ID":"f569857c-ad8b-4a0f-ad11-6846ae2d61e3","Type":"ContainerDied","Data":"cd5c8e7d1ff32a7aa54f72d9c8fab9c4342db0ff710c01ef286edef9ad5d0af7"}
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.426820 4585 scope.go:117] "RemoveContainer" containerID="20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.479052 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5r6df"]
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.485840 4585 scope.go:117] "RemoveContainer" containerID="24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.492510 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5r6df"]
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.532146 4585 scope.go:117] "RemoveContainer" containerID="89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.582370 4585 scope.go:117] "RemoveContainer" containerID="20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a"
Feb 15 17:42:48 crc kubenswrapper[4585]: E0215 17:42:48.582856 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a\": container with ID starting with 20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a not found: ID does not exist" containerID="20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.582890 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a"} err="failed to get container status \"20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a\": rpc error: code = NotFound desc = could not find container \"20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a\": container with ID starting with 20841b24f4a778d01f3db9d2e101f271e328fea799be40b86753d0988116143a not found: ID does not exist"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.582910 4585 scope.go:117] "RemoveContainer" containerID="24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52"
Feb 15 17:42:48 crc kubenswrapper[4585]: E0215 17:42:48.583784 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52\": container with ID starting with 24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52 not found: ID does not exist" containerID="24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.583810 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52"} err="failed to get container status \"24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52\": rpc error: code = NotFound desc = could not find container \"24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52\": container with ID starting with 24fbd24746908270852b442d9cffde177319f4ee67f47d14adb16b6119c85b52 not found: ID does not exist"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.583826 4585 scope.go:117] "RemoveContainer" containerID="89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d"
Feb 15 17:42:48 crc kubenswrapper[4585]: E0215 17:42:48.584873 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d\": container with ID starting with 89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d not found: ID does not exist" containerID="89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.584928 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d"} err="failed to get container status \"89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d\": rpc error: code = NotFound desc = could not find container \"89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d\": container with ID starting with 89c9ec16f136c312ce1dac31c5df3ecef6103757d9e768986db480cd03963e3d not found: ID does not exist"
Feb 15 17:42:48 crc kubenswrapper[4585]: I0215 17:42:48.874274 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" path="/var/lib/kubelet/pods/f569857c-ad8b-4a0f-ad11-6846ae2d61e3/volumes"
Feb 15 17:42:58 crc kubenswrapper[4585]: I0215 17:42:58.841233 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:42:58 crc kubenswrapper[4585]: E0215 17:42:58.842017 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:43:09 crc kubenswrapper[4585]: I0215 17:43:09.842837 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:43:09 crc kubenswrapper[4585]: E0215 17:43:09.843826 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:43:24 crc kubenswrapper[4585]: I0215 17:43:24.857350 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:43:24 crc kubenswrapper[4585]: E0215 17:43:24.858061 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:43:37 crc kubenswrapper[4585]: I0215 17:43:37.841644 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:43:37 crc kubenswrapper[4585]: E0215 17:43:37.842415 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:43:52 crc kubenswrapper[4585]: I0215 17:43:52.842824 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:43:52 crc kubenswrapper[4585]: E0215 17:43:52.843547 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:44:03 crc kubenswrapper[4585]: I0215 17:44:03.842389 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:44:03 crc kubenswrapper[4585]: E0215 17:44:03.843446 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:44:18 crc kubenswrapper[4585]: I0215 17:44:18.842435 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:44:18 crc kubenswrapper[4585]: E0215 17:44:18.843089 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:44:29 crc kubenswrapper[4585]: I0215 17:44:29.841501 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:44:29 crc kubenswrapper[4585]: E0215 17:44:29.842701 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:44:40 crc kubenswrapper[4585]: I0215 17:44:40.841478 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:44:40 crc kubenswrapper[4585]: E0215 17:44:40.842293 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:44:55 crc kubenswrapper[4585]: I0215 17:44:55.858283 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37"
Feb 15 17:44:55 crc kubenswrapper[4585]: E0215 17:44:55.859104 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb
15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.192245 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48"] Feb 15 17:45:00 crc kubenswrapper[4585]: E0215 17:45:00.195624 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="registry-server" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.195723 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="registry-server" Feb 15 17:45:00 crc kubenswrapper[4585]: E0215 17:45:00.195861 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="extract-content" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.195920 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="extract-content" Feb 15 17:45:00 crc kubenswrapper[4585]: E0215 17:45:00.195995 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="extract-utilities" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.196059 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="extract-utilities" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.196331 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f569857c-ad8b-4a0f-ad11-6846ae2d61e3" containerName="registry-server" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.197255 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.201814 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.202698 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.211624 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48"] Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.299324 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-config-volume\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.299376 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtnf\" (UniqueName: \"kubernetes.io/projected/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-kube-api-access-cmtnf\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.299508 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-secret-volume\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.402586 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-secret-volume\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.402860 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-config-volume\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.402916 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtnf\" (UniqueName: \"kubernetes.io/projected/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-kube-api-access-cmtnf\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.404534 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-config-volume\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.408366 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-secret-volume\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.423236 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtnf\" (UniqueName: \"kubernetes.io/projected/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-kube-api-access-cmtnf\") pod \"collect-profiles-29519625-4ss48\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:00 crc kubenswrapper[4585]: I0215 17:45:00.520883 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:01 crc kubenswrapper[4585]: I0215 17:45:01.111146 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48"] Feb 15 17:45:01 crc kubenswrapper[4585]: W0215 17:45:01.125885 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513e70ea_db0b_4b29_ba5b_eac62c7a52ba.slice/crio-34427ac1ec3d7c54cd46d259693c930211726458e02f8804352c277017a87944 WatchSource:0}: Error finding container 34427ac1ec3d7c54cd46d259693c930211726458e02f8804352c277017a87944: Status 404 returned error can't find the container with id 34427ac1ec3d7c54cd46d259693c930211726458e02f8804352c277017a87944 Feb 15 17:45:02 crc kubenswrapper[4585]: I0215 17:45:02.047680 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" event={"ID":"513e70ea-db0b-4b29-ba5b-eac62c7a52ba","Type":"ContainerDied","Data":"f3ad4d924d0b109859b4145c24fd59a43149ef8727bdb606fcfbb29a02f11862"} Feb 15 17:45:02 crc kubenswrapper[4585]: 
I0215 17:45:02.047528 4585 generic.go:334] "Generic (PLEG): container finished" podID="513e70ea-db0b-4b29-ba5b-eac62c7a52ba" containerID="f3ad4d924d0b109859b4145c24fd59a43149ef8727bdb606fcfbb29a02f11862" exitCode=0 Feb 15 17:45:02 crc kubenswrapper[4585]: I0215 17:45:02.048145 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" event={"ID":"513e70ea-db0b-4b29-ba5b-eac62c7a52ba","Type":"ContainerStarted","Data":"34427ac1ec3d7c54cd46d259693c930211726458e02f8804352c277017a87944"} Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.512167 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.593881 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-secret-volume\") pod \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.594417 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-config-volume\") pod \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.594684 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmtnf\" (UniqueName: \"kubernetes.io/projected/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-kube-api-access-cmtnf\") pod \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\" (UID: \"513e70ea-db0b-4b29-ba5b-eac62c7a52ba\") " Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.606336 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "513e70ea-db0b-4b29-ba5b-eac62c7a52ba" (UID: "513e70ea-db0b-4b29-ba5b-eac62c7a52ba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.621084 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "513e70ea-db0b-4b29-ba5b-eac62c7a52ba" (UID: "513e70ea-db0b-4b29-ba5b-eac62c7a52ba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.622052 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-kube-api-access-cmtnf" (OuterVolumeSpecName: "kube-api-access-cmtnf") pod "513e70ea-db0b-4b29-ba5b-eac62c7a52ba" (UID: "513e70ea-db0b-4b29-ba5b-eac62c7a52ba"). InnerVolumeSpecName "kube-api-access-cmtnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.698118 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.698389 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-config-volume\") on node \"crc\" DevicePath \"\"" Feb 15 17:45:03 crc kubenswrapper[4585]: I0215 17:45:03.698489 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmtnf\" (UniqueName: \"kubernetes.io/projected/513e70ea-db0b-4b29-ba5b-eac62c7a52ba-kube-api-access-cmtnf\") on node \"crc\" DevicePath \"\"" Feb 15 17:45:04 crc kubenswrapper[4585]: I0215 17:45:04.068423 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" event={"ID":"513e70ea-db0b-4b29-ba5b-eac62c7a52ba","Type":"ContainerDied","Data":"34427ac1ec3d7c54cd46d259693c930211726458e02f8804352c277017a87944"} Feb 15 17:45:04 crc kubenswrapper[4585]: I0215 17:45:04.068482 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34427ac1ec3d7c54cd46d259693c930211726458e02f8804352c277017a87944" Feb 15 17:45:04 crc kubenswrapper[4585]: I0215 17:45:04.068541 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519625-4ss48" Feb 15 17:45:04 crc kubenswrapper[4585]: I0215 17:45:04.616636 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"] Feb 15 17:45:04 crc kubenswrapper[4585]: I0215 17:45:04.627909 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519580-zkkws"] Feb 15 17:45:04 crc kubenswrapper[4585]: I0215 17:45:04.857950 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b450dc-9948-4b88-b099-3d1aebf653d3" path="/var/lib/kubelet/pods/b6b450dc-9948-4b88-b099-3d1aebf653d3/volumes" Feb 15 17:45:09 crc kubenswrapper[4585]: I0215 17:45:09.841838 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:45:09 crc kubenswrapper[4585]: E0215 17:45:09.842488 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:45:22 crc kubenswrapper[4585]: I0215 17:45:22.843348 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:45:22 crc kubenswrapper[4585]: E0215 17:45:22.844513 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:45:36 crc kubenswrapper[4585]: I0215 17:45:36.841937 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:45:36 crc kubenswrapper[4585]: E0215 17:45:36.843521 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:45:49 crc kubenswrapper[4585]: I0215 17:45:49.844118 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:45:49 crc kubenswrapper[4585]: E0215 17:45:49.845067 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:45:50 crc kubenswrapper[4585]: I0215 17:45:50.691754 4585 scope.go:117] "RemoveContainer" containerID="7d44e65eb96a39c541d4c38939ecc3f089cbbd92469a65573e57d7d52f200d86" Feb 15 17:46:01 crc kubenswrapper[4585]: I0215 17:46:01.843569 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:46:01 crc kubenswrapper[4585]: E0215 17:46:01.844546 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:46:13 crc kubenswrapper[4585]: I0215 17:46:13.842468 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:46:13 crc kubenswrapper[4585]: E0215 17:46:13.843099 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:46:26 crc kubenswrapper[4585]: I0215 17:46:26.843031 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:46:26 crc kubenswrapper[4585]: E0215 17:46:26.844188 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:46:37 crc kubenswrapper[4585]: I0215 17:46:37.842149 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:46:37 crc kubenswrapper[4585]: E0215 17:46:37.843159 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:46:48 crc kubenswrapper[4585]: I0215 17:46:48.842302 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:46:48 crc kubenswrapper[4585]: E0215 17:46:48.843668 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:47:01 crc kubenswrapper[4585]: I0215 17:47:01.842127 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:47:01 crc kubenswrapper[4585]: E0215 17:47:01.842956 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:47:12 crc kubenswrapper[4585]: I0215 17:47:12.841950 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:47:12 crc kubenswrapper[4585]: E0215 17:47:12.842668 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:47:26 crc kubenswrapper[4585]: I0215 17:47:26.842454 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:47:27 crc kubenswrapper[4585]: I0215 17:47:27.943869 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"707e2cabdae5fd4224842942496f67e0a37a5dc60a81ac67d588c49fa31af510"} Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.475167 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bjfbz"] Feb 15 17:48:59 crc kubenswrapper[4585]: E0215 17:48:59.476468 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513e70ea-db0b-4b29-ba5b-eac62c7a52ba" containerName="collect-profiles" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.476486 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="513e70ea-db0b-4b29-ba5b-eac62c7a52ba" containerName="collect-profiles" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.476809 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="513e70ea-db0b-4b29-ba5b-eac62c7a52ba" containerName="collect-profiles" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.478650 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.492843 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjfbz"] Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.616792 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-utilities\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.617166 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2k8\" (UniqueName: \"kubernetes.io/projected/c53d0b62-1fa6-4a45-8bdd-7cb028409306-kube-api-access-nr2k8\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.617212 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-catalog-content\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.720263 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2k8\" (UniqueName: \"kubernetes.io/projected/c53d0b62-1fa6-4a45-8bdd-7cb028409306-kube-api-access-nr2k8\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.720333 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-catalog-content\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.720467 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-utilities\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.721333 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-utilities\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.721948 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-catalog-content\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.745866 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2k8\" (UniqueName: \"kubernetes.io/projected/c53d0b62-1fa6-4a45-8bdd-7cb028409306-kube-api-access-nr2k8\") pod \"certified-operators-bjfbz\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:48:59 crc kubenswrapper[4585]: I0215 17:48:59.819773 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:49:00 crc kubenswrapper[4585]: I0215 17:49:00.392206 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjfbz"] Feb 15 17:49:01 crc kubenswrapper[4585]: I0215 17:49:01.310678 4585 generic.go:334] "Generic (PLEG): container finished" podID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerID="e5ec139502d5720f530c7f30f678a923a51587ddeb09d1904a3ba026512bf3bf" exitCode=0 Feb 15 17:49:01 crc kubenswrapper[4585]: I0215 17:49:01.310913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjfbz" event={"ID":"c53d0b62-1fa6-4a45-8bdd-7cb028409306","Type":"ContainerDied","Data":"e5ec139502d5720f530c7f30f678a923a51587ddeb09d1904a3ba026512bf3bf"} Feb 15 17:49:01 crc kubenswrapper[4585]: I0215 17:49:01.310972 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjfbz" event={"ID":"c53d0b62-1fa6-4a45-8bdd-7cb028409306","Type":"ContainerStarted","Data":"9f72d5d53dc8e7037b151b890f44c882b16b0f08482bcb5f966bff350235bf9d"} Feb 15 17:49:01 crc kubenswrapper[4585]: I0215 17:49:01.313871 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:49:02 crc kubenswrapper[4585]: I0215 17:49:02.329625 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjfbz" event={"ID":"c53d0b62-1fa6-4a45-8bdd-7cb028409306","Type":"ContainerStarted","Data":"676419aa59687978e6be8a03a219060bdb80ac60ca1bd735adc94c7719d56db1"} Feb 15 17:49:04 crc kubenswrapper[4585]: I0215 17:49:04.357883 4585 generic.go:334] "Generic (PLEG): container finished" podID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerID="676419aa59687978e6be8a03a219060bdb80ac60ca1bd735adc94c7719d56db1" exitCode=0 Feb 15 17:49:04 crc kubenswrapper[4585]: I0215 17:49:04.357990 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-bjfbz" event={"ID":"c53d0b62-1fa6-4a45-8bdd-7cb028409306","Type":"ContainerDied","Data":"676419aa59687978e6be8a03a219060bdb80ac60ca1bd735adc94c7719d56db1"} Feb 15 17:49:05 crc kubenswrapper[4585]: I0215 17:49:05.381371 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjfbz" event={"ID":"c53d0b62-1fa6-4a45-8bdd-7cb028409306","Type":"ContainerStarted","Data":"37b8e8f92bb215a9f865d2900472bc460cd4a7b803f16e63fa2b8b776429303a"} Feb 15 17:49:05 crc kubenswrapper[4585]: I0215 17:49:05.417313 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bjfbz" podStartSLOduration=2.7219169880000003 podStartE2EDuration="6.417280856s" podCreationTimestamp="2026-02-15 17:48:59 +0000 UTC" firstStartedPulling="2026-02-15 17:49:01.313580805 +0000 UTC m=+2597.256988947" lastFinishedPulling="2026-02-15 17:49:05.008944683 +0000 UTC m=+2600.952352815" observedRunningTime="2026-02-15 17:49:05.404194184 +0000 UTC m=+2601.347602356" watchObservedRunningTime="2026-02-15 17:49:05.417280856 +0000 UTC m=+2601.360689028" Feb 15 17:49:09 crc kubenswrapper[4585]: I0215 17:49:09.821761 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:49:09 crc kubenswrapper[4585]: I0215 17:49:09.822292 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:49:09 crc kubenswrapper[4585]: I0215 17:49:09.878979 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:49:10 crc kubenswrapper[4585]: I0215 17:49:10.515864 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:49:13 crc kubenswrapper[4585]: I0215 
17:49:13.444495 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjfbz"] Feb 15 17:49:13 crc kubenswrapper[4585]: I0215 17:49:13.445219 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bjfbz" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="registry-server" containerID="cri-o://37b8e8f92bb215a9f865d2900472bc460cd4a7b803f16e63fa2b8b776429303a" gracePeriod=2 Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.675086 4585 generic.go:334] "Generic (PLEG): container finished" podID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerID="37b8e8f92bb215a9f865d2900472bc460cd4a7b803f16e63fa2b8b776429303a" exitCode=0 Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.675241 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjfbz" event={"ID":"c53d0b62-1fa6-4a45-8bdd-7cb028409306","Type":"ContainerDied","Data":"37b8e8f92bb215a9f865d2900472bc460cd4a7b803f16e63fa2b8b776429303a"} Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.771344 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.836671 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-utilities\") pod \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.837568 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-utilities" (OuterVolumeSpecName: "utilities") pod "c53d0b62-1fa6-4a45-8bdd-7cb028409306" (UID: "c53d0b62-1fa6-4a45-8bdd-7cb028409306"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.837636 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-catalog-content\") pod \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.837685 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr2k8\" (UniqueName: \"kubernetes.io/projected/c53d0b62-1fa6-4a45-8bdd-7cb028409306-kube-api-access-nr2k8\") pod \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\" (UID: \"c53d0b62-1fa6-4a45-8bdd-7cb028409306\") " Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.839536 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.852802 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53d0b62-1fa6-4a45-8bdd-7cb028409306-kube-api-access-nr2k8" (OuterVolumeSpecName: "kube-api-access-nr2k8") pod "c53d0b62-1fa6-4a45-8bdd-7cb028409306" (UID: "c53d0b62-1fa6-4a45-8bdd-7cb028409306"). InnerVolumeSpecName "kube-api-access-nr2k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.911344 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c53d0b62-1fa6-4a45-8bdd-7cb028409306" (UID: "c53d0b62-1fa6-4a45-8bdd-7cb028409306"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.941998 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr2k8\" (UniqueName: \"kubernetes.io/projected/c53d0b62-1fa6-4a45-8bdd-7cb028409306-kube-api-access-nr2k8\") on node \"crc\" DevicePath \"\"" Feb 15 17:49:14 crc kubenswrapper[4585]: I0215 17:49:14.942031 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c53d0b62-1fa6-4a45-8bdd-7cb028409306-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:49:15 crc kubenswrapper[4585]: I0215 17:49:15.695876 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjfbz" event={"ID":"c53d0b62-1fa6-4a45-8bdd-7cb028409306","Type":"ContainerDied","Data":"9f72d5d53dc8e7037b151b890f44c882b16b0f08482bcb5f966bff350235bf9d"} Feb 15 17:49:15 crc kubenswrapper[4585]: I0215 17:49:15.695967 4585 scope.go:117] "RemoveContainer" containerID="37b8e8f92bb215a9f865d2900472bc460cd4a7b803f16e63fa2b8b776429303a" Feb 15 17:49:15 crc kubenswrapper[4585]: I0215 17:49:15.696011 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bjfbz" Feb 15 17:49:15 crc kubenswrapper[4585]: I0215 17:49:15.737404 4585 scope.go:117] "RemoveContainer" containerID="676419aa59687978e6be8a03a219060bdb80ac60ca1bd735adc94c7719d56db1" Feb 15 17:49:15 crc kubenswrapper[4585]: I0215 17:49:15.748269 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjfbz"] Feb 15 17:49:15 crc kubenswrapper[4585]: I0215 17:49:15.779607 4585 scope.go:117] "RemoveContainer" containerID="e5ec139502d5720f530c7f30f678a923a51587ddeb09d1904a3ba026512bf3bf" Feb 15 17:49:15 crc kubenswrapper[4585]: I0215 17:49:15.794567 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bjfbz"] Feb 15 17:49:16 crc kubenswrapper[4585]: I0215 17:49:16.891803 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" path="/var/lib/kubelet/pods/c53d0b62-1fa6-4a45-8bdd-7cb028409306/volumes" Feb 15 17:49:47 crc kubenswrapper[4585]: I0215 17:49:47.014319 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:49:47 crc kubenswrapper[4585]: I0215 17:49:47.014910 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:50:17 crc kubenswrapper[4585]: I0215 17:50:17.013743 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:50:17 crc kubenswrapper[4585]: I0215 17:50:17.014238 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.014494 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.015041 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.015106 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.016023 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"707e2cabdae5fd4224842942496f67e0a37a5dc60a81ac67d588c49fa31af510"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.016101 4585 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://707e2cabdae5fd4224842942496f67e0a37a5dc60a81ac67d588c49fa31af510" gracePeriod=600 Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.780678 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="707e2cabdae5fd4224842942496f67e0a37a5dc60a81ac67d588c49fa31af510" exitCode=0 Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.781190 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"707e2cabdae5fd4224842942496f67e0a37a5dc60a81ac67d588c49fa31af510"} Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.781221 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44"} Feb 15 17:50:47 crc kubenswrapper[4585]: I0215 17:50:47.781238 4585 scope.go:117] "RemoveContainer" containerID="03f19251944c1e70a59bc4a90a43a7f1e1af7415884bb541def45a46000bed37" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.883215 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8bz9w/must-gather-47drt"] Feb 15 17:50:55 crc kubenswrapper[4585]: E0215 17:50:55.884060 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="registry-server" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.884071 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="registry-server" 
Feb 15 17:50:55 crc kubenswrapper[4585]: E0215 17:50:55.884094 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="extract-content" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.884099 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="extract-content" Feb 15 17:50:55 crc kubenswrapper[4585]: E0215 17:50:55.884123 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="extract-utilities" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.884129 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="extract-utilities" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.884344 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53d0b62-1fa6-4a45-8bdd-7cb028409306" containerName="registry-server" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.885959 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.892879 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8bz9w"/"default-dockercfg-md7fx" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.893087 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8bz9w"/"kube-root-ca.crt" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.898270 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8bz9w"/"openshift-service-ca.crt" Feb 15 17:50:55 crc kubenswrapper[4585]: I0215 17:50:55.900744 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bz9w/must-gather-47drt"] Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.073482 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-must-gather-output\") pod \"must-gather-47drt\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.074737 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29zqr\" (UniqueName: \"kubernetes.io/projected/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-kube-api-access-29zqr\") pod \"must-gather-47drt\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.176084 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29zqr\" (UniqueName: \"kubernetes.io/projected/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-kube-api-access-29zqr\") pod \"must-gather-47drt\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " 
pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.176269 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-must-gather-output\") pod \"must-gather-47drt\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.176629 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-must-gather-output\") pod \"must-gather-47drt\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.200408 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29zqr\" (UniqueName: \"kubernetes.io/projected/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-kube-api-access-29zqr\") pod \"must-gather-47drt\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.211705 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.732310 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8bz9w/must-gather-47drt"] Feb 15 17:50:56 crc kubenswrapper[4585]: I0215 17:50:56.903259 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/must-gather-47drt" event={"ID":"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6","Type":"ContainerStarted","Data":"085d1886f6e80cd6ea6664028ed2030d37287c5983a8209e7977d746a9af6638"} Feb 15 17:51:06 crc kubenswrapper[4585]: I0215 17:51:06.003122 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/must-gather-47drt" event={"ID":"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6","Type":"ContainerStarted","Data":"f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284"} Feb 15 17:51:07 crc kubenswrapper[4585]: I0215 17:51:07.013893 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/must-gather-47drt" event={"ID":"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6","Type":"ContainerStarted","Data":"390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665"} Feb 15 17:51:07 crc kubenswrapper[4585]: I0215 17:51:07.033341 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8bz9w/must-gather-47drt" podStartSLOduration=3.233814223 podStartE2EDuration="12.033328092s" podCreationTimestamp="2026-02-15 17:50:55 +0000 UTC" firstStartedPulling="2026-02-15 17:50:56.735774202 +0000 UTC m=+2712.679182344" lastFinishedPulling="2026-02-15 17:51:05.535288071 +0000 UTC m=+2721.478696213" observedRunningTime="2026-02-15 17:51:07.030738412 +0000 UTC m=+2722.974146544" watchObservedRunningTime="2026-02-15 17:51:07.033328092 +0000 UTC m=+2722.976736224" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.263494 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-8bz9w/crc-debug-d2wgk"] Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.265199 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.449138 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrv4f\" (UniqueName: \"kubernetes.io/projected/2172430f-66dd-4971-813c-700b51390b9f-kube-api-access-wrv4f\") pod \"crc-debug-d2wgk\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.449415 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2172430f-66dd-4971-813c-700b51390b9f-host\") pod \"crc-debug-d2wgk\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.550666 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrv4f\" (UniqueName: \"kubernetes.io/projected/2172430f-66dd-4971-813c-700b51390b9f-kube-api-access-wrv4f\") pod \"crc-debug-d2wgk\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.550720 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2172430f-66dd-4971-813c-700b51390b9f-host\") pod \"crc-debug-d2wgk\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.550908 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2172430f-66dd-4971-813c-700b51390b9f-host\") pod \"crc-debug-d2wgk\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.587813 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrv4f\" (UniqueName: \"kubernetes.io/projected/2172430f-66dd-4971-813c-700b51390b9f-kube-api-access-wrv4f\") pod \"crc-debug-d2wgk\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:11 crc kubenswrapper[4585]: I0215 17:51:11.882316 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:12 crc kubenswrapper[4585]: I0215 17:51:12.056885 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" event={"ID":"2172430f-66dd-4971-813c-700b51390b9f","Type":"ContainerStarted","Data":"223e25517d19a690a6e22a2457a6e56e4c86deb7cac83d691b8bb5ab888983ac"} Feb 15 17:51:26 crc kubenswrapper[4585]: I0215 17:51:26.240355 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" event={"ID":"2172430f-66dd-4971-813c-700b51390b9f","Type":"ContainerStarted","Data":"18e7c3ac9e22dec8e7d92e03d4124b80ca4f9ba9d4e3405b20896bfa68821be6"} Feb 15 17:51:26 crc kubenswrapper[4585]: I0215 17:51:26.263471 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" podStartSLOduration=1.449552414 podStartE2EDuration="15.263454329s" podCreationTimestamp="2026-02-15 17:51:11 +0000 UTC" firstStartedPulling="2026-02-15 17:51:11.915373368 +0000 UTC m=+2727.858781500" lastFinishedPulling="2026-02-15 17:51:25.729275283 +0000 UTC m=+2741.672683415" observedRunningTime="2026-02-15 17:51:26.260963833 +0000 UTC m=+2742.204371965" watchObservedRunningTime="2026-02-15 
17:51:26.263454329 +0000 UTC m=+2742.206862461" Feb 15 17:51:41 crc kubenswrapper[4585]: I0215 17:51:41.380854 4585 generic.go:334] "Generic (PLEG): container finished" podID="2172430f-66dd-4971-813c-700b51390b9f" containerID="18e7c3ac9e22dec8e7d92e03d4124b80ca4f9ba9d4e3405b20896bfa68821be6" exitCode=0 Feb 15 17:51:41 crc kubenswrapper[4585]: I0215 17:51:41.381039 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" event={"ID":"2172430f-66dd-4971-813c-700b51390b9f","Type":"ContainerDied","Data":"18e7c3ac9e22dec8e7d92e03d4124b80ca4f9ba9d4e3405b20896bfa68821be6"} Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.507482 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.551408 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8bz9w/crc-debug-d2wgk"] Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.561754 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8bz9w/crc-debug-d2wgk"] Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.694888 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2172430f-66dd-4971-813c-700b51390b9f-host\") pod \"2172430f-66dd-4971-813c-700b51390b9f\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.694973 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2172430f-66dd-4971-813c-700b51390b9f-host" (OuterVolumeSpecName: "host") pod "2172430f-66dd-4971-813c-700b51390b9f" (UID: "2172430f-66dd-4971-813c-700b51390b9f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.695086 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrv4f\" (UniqueName: \"kubernetes.io/projected/2172430f-66dd-4971-813c-700b51390b9f-kube-api-access-wrv4f\") pod \"2172430f-66dd-4971-813c-700b51390b9f\" (UID: \"2172430f-66dd-4971-813c-700b51390b9f\") " Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.695586 4585 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2172430f-66dd-4971-813c-700b51390b9f-host\") on node \"crc\" DevicePath \"\"" Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.708775 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2172430f-66dd-4971-813c-700b51390b9f-kube-api-access-wrv4f" (OuterVolumeSpecName: "kube-api-access-wrv4f") pod "2172430f-66dd-4971-813c-700b51390b9f" (UID: "2172430f-66dd-4971-813c-700b51390b9f"). InnerVolumeSpecName "kube-api-access-wrv4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.797859 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrv4f\" (UniqueName: \"kubernetes.io/projected/2172430f-66dd-4971-813c-700b51390b9f-kube-api-access-wrv4f\") on node \"crc\" DevicePath \"\"" Feb 15 17:51:42 crc kubenswrapper[4585]: I0215 17:51:42.850928 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2172430f-66dd-4971-813c-700b51390b9f" path="/var/lib/kubelet/pods/2172430f-66dd-4971-813c-700b51390b9f/volumes" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.400195 4585 scope.go:117] "RemoveContainer" containerID="18e7c3ac9e22dec8e7d92e03d4124b80ca4f9ba9d4e3405b20896bfa68821be6" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.400229 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-d2wgk" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.722586 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8bz9w/crc-debug-5t8zn"] Feb 15 17:51:43 crc kubenswrapper[4585]: E0215 17:51:43.723360 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2172430f-66dd-4971-813c-700b51390b9f" containerName="container-00" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.723373 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2172430f-66dd-4971-813c-700b51390b9f" containerName="container-00" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.723650 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2172430f-66dd-4971-813c-700b51390b9f" containerName="container-00" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.724387 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.920523 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms784\" (UniqueName: \"kubernetes.io/projected/2e85fbc9-8159-4114-b225-a8dc3ace9b13-kube-api-access-ms784\") pod \"crc-debug-5t8zn\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:43 crc kubenswrapper[4585]: I0215 17:51:43.920561 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e85fbc9-8159-4114-b225-a8dc3ace9b13-host\") pod \"crc-debug-5t8zn\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.022497 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2e85fbc9-8159-4114-b225-a8dc3ace9b13-host\") pod \"crc-debug-5t8zn\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.022703 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e85fbc9-8159-4114-b225-a8dc3ace9b13-host\") pod \"crc-debug-5t8zn\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.022717 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms784\" (UniqueName: \"kubernetes.io/projected/2e85fbc9-8159-4114-b225-a8dc3ace9b13-kube-api-access-ms784\") pod \"crc-debug-5t8zn\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.042874 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms784\" (UniqueName: \"kubernetes.io/projected/2e85fbc9-8159-4114-b225-a8dc3ace9b13-kube-api-access-ms784\") pod \"crc-debug-5t8zn\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.043431 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.411399 4585 generic.go:334] "Generic (PLEG): container finished" podID="2e85fbc9-8159-4114-b225-a8dc3ace9b13" containerID="8dbd74e5d0c35fbcb62ec7c34f9392214146a60f0262c5c804296e8d19db2b89" exitCode=1 Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.411445 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" event={"ID":"2e85fbc9-8159-4114-b225-a8dc3ace9b13","Type":"ContainerDied","Data":"8dbd74e5d0c35fbcb62ec7c34f9392214146a60f0262c5c804296e8d19db2b89"} Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.411720 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" event={"ID":"2e85fbc9-8159-4114-b225-a8dc3ace9b13","Type":"ContainerStarted","Data":"44b396fc1ad4c10177a82c1e2f5012b7da7397b847957661a7fbf209107c0492"} Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.448425 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8bz9w/crc-debug-5t8zn"] Feb 15 17:51:44 crc kubenswrapper[4585]: I0215 17:51:44.458756 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8bz9w/crc-debug-5t8zn"] Feb 15 17:51:45 crc kubenswrapper[4585]: I0215 17:51:45.520509 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:45 crc kubenswrapper[4585]: I0215 17:51:45.563788 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e85fbc9-8159-4114-b225-a8dc3ace9b13-host\") pod \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " Feb 15 17:51:45 crc kubenswrapper[4585]: I0215 17:51:45.563862 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms784\" (UniqueName: \"kubernetes.io/projected/2e85fbc9-8159-4114-b225-a8dc3ace9b13-kube-api-access-ms784\") pod \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\" (UID: \"2e85fbc9-8159-4114-b225-a8dc3ace9b13\") " Feb 15 17:51:45 crc kubenswrapper[4585]: I0215 17:51:45.563897 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e85fbc9-8159-4114-b225-a8dc3ace9b13-host" (OuterVolumeSpecName: "host") pod "2e85fbc9-8159-4114-b225-a8dc3ace9b13" (UID: "2e85fbc9-8159-4114-b225-a8dc3ace9b13"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 15 17:51:45 crc kubenswrapper[4585]: I0215 17:51:45.564153 4585 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e85fbc9-8159-4114-b225-a8dc3ace9b13-host\") on node \"crc\" DevicePath \"\"" Feb 15 17:51:45 crc kubenswrapper[4585]: I0215 17:51:45.576406 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e85fbc9-8159-4114-b225-a8dc3ace9b13-kube-api-access-ms784" (OuterVolumeSpecName: "kube-api-access-ms784") pod "2e85fbc9-8159-4114-b225-a8dc3ace9b13" (UID: "2e85fbc9-8159-4114-b225-a8dc3ace9b13"). InnerVolumeSpecName "kube-api-access-ms784". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:51:45 crc kubenswrapper[4585]: I0215 17:51:45.665533 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms784\" (UniqueName: \"kubernetes.io/projected/2e85fbc9-8159-4114-b225-a8dc3ace9b13-kube-api-access-ms784\") on node \"crc\" DevicePath \"\"" Feb 15 17:51:46 crc kubenswrapper[4585]: I0215 17:51:46.445552 4585 scope.go:117] "RemoveContainer" containerID="8dbd74e5d0c35fbcb62ec7c34f9392214146a60f0262c5c804296e8d19db2b89" Feb 15 17:51:46 crc kubenswrapper[4585]: I0215 17:51:46.445720 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bz9w/crc-debug-5t8zn" Feb 15 17:51:46 crc kubenswrapper[4585]: I0215 17:51:46.857514 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e85fbc9-8159-4114-b225-a8dc3ace9b13" path="/var/lib/kubelet/pods/2e85fbc9-8159-4114-b225-a8dc3ace9b13/volumes" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.180284 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-skdmv"] Feb 15 17:52:03 crc kubenswrapper[4585]: E0215 17:52:03.181167 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e85fbc9-8159-4114-b225-a8dc3ace9b13" containerName="container-00" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.181179 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e85fbc9-8159-4114-b225-a8dc3ace9b13" containerName="container-00" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.181410 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e85fbc9-8159-4114-b225-a8dc3ace9b13" containerName="container-00" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.183345 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.222184 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-skdmv"] Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.320447 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-catalog-content\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.320798 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-utilities\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.320943 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnzp\" (UniqueName: \"kubernetes.io/projected/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-kube-api-access-nbnzp\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.424031 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-catalog-content\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.424622 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-catalog-content\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.424788 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-utilities\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.424818 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbnzp\" (UniqueName: \"kubernetes.io/projected/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-kube-api-access-nbnzp\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.425050 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-utilities\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.455280 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbnzp\" (UniqueName: \"kubernetes.io/projected/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-kube-api-access-nbnzp\") pod \"community-operators-skdmv\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:03 crc kubenswrapper[4585]: I0215 17:52:03.512582 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:04 crc kubenswrapper[4585]: I0215 17:52:04.315920 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-skdmv"] Feb 15 17:52:04 crc kubenswrapper[4585]: I0215 17:52:04.635637 4585 generic.go:334] "Generic (PLEG): container finished" podID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerID="bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f" exitCode=0 Feb 15 17:52:04 crc kubenswrapper[4585]: I0215 17:52:04.635672 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skdmv" event={"ID":"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb","Type":"ContainerDied","Data":"bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f"} Feb 15 17:52:04 crc kubenswrapper[4585]: I0215 17:52:04.635696 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skdmv" event={"ID":"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb","Type":"ContainerStarted","Data":"8d207176c5bb5d1013ecc6bac6add2be4165513f09ce3ee7c774776800444de2"} Feb 15 17:52:05 crc kubenswrapper[4585]: I0215 17:52:05.646456 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skdmv" event={"ID":"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb","Type":"ContainerStarted","Data":"dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6"} Feb 15 17:52:07 crc kubenswrapper[4585]: I0215 17:52:07.669472 4585 generic.go:334] "Generic (PLEG): container finished" podID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerID="dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6" exitCode=0 Feb 15 17:52:07 crc kubenswrapper[4585]: I0215 17:52:07.669515 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skdmv" 
event={"ID":"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb","Type":"ContainerDied","Data":"dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6"} Feb 15 17:52:08 crc kubenswrapper[4585]: I0215 17:52:08.681374 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skdmv" event={"ID":"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb","Type":"ContainerStarted","Data":"e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87"} Feb 15 17:52:08 crc kubenswrapper[4585]: I0215 17:52:08.709117 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-skdmv" podStartSLOduration=2.290872959 podStartE2EDuration="5.709099223s" podCreationTimestamp="2026-02-15 17:52:03 +0000 UTC" firstStartedPulling="2026-02-15 17:52:04.638528563 +0000 UTC m=+2780.581936695" lastFinishedPulling="2026-02-15 17:52:08.056754817 +0000 UTC m=+2784.000162959" observedRunningTime="2026-02-15 17:52:08.705821855 +0000 UTC m=+2784.649229997" watchObservedRunningTime="2026-02-15 17:52:08.709099223 +0000 UTC m=+2784.652507355" Feb 15 17:52:13 crc kubenswrapper[4585]: I0215 17:52:13.513717 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:13 crc kubenswrapper[4585]: I0215 17:52:13.514179 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:14 crc kubenswrapper[4585]: I0215 17:52:14.560021 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-skdmv" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="registry-server" probeResult="failure" output=< Feb 15 17:52:14 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Feb 15 17:52:14 crc kubenswrapper[4585]: > Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.145981 4585 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_barbican-api-6f6d69d-zt7nc_00bfac9c-5e69-41c0-813a-c163bb169b0d/barbican-api/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.294560 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f6d69d-zt7nc_00bfac9c-5e69-41c0-813a-c163bb169b0d/barbican-api-log/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.397057 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-765946969b-cdgqp_42089924-7f2c-40bd-a930-74f9ae10b784/barbican-keystone-listener/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.416870 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-765946969b-cdgqp_42089924-7f2c-40bd-a930-74f9ae10b784/barbican-keystone-listener-log/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.642146 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-666c6c699f-9zzq7_8e924d6a-e504-41b5-8268-f6df32a3e507/barbican-worker/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.666724 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-666c6c699f-9zzq7_8e924d6a-e504-41b5-8268-f6df32a3e507/barbican-worker-log/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.788944 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e37c5e7e-27a1-4ca2-a04d-588392a0f115/ceilometer-central-agent/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.927831 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e37c5e7e-27a1-4ca2-a04d-588392a0f115/ceilometer-notification-agent/0.log" Feb 15 17:52:22 crc kubenswrapper[4585]: I0215 17:52:22.962466 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e37c5e7e-27a1-4ca2-a04d-588392a0f115/sg-core/0.log" Feb 15 17:52:22 crc 
kubenswrapper[4585]: I0215 17:52:22.993647 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e37c5e7e-27a1-4ca2-a04d-588392a0f115/proxy-httpd/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.148589 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8636bbb9-0fc2-4481-b149-bc30884e3819/cinder-api/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.186448 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8636bbb9-0fc2-4481-b149-bc30884e3819/cinder-api-log/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.336950 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_44c6f599-40fd-4592-9275-f1158b3126b0/probe/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.387865 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_44c6f599-40fd-4592-9275-f1158b3126b0/cinder-scheduler/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.544073 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mbrdb_8982df7b-970d-41de-8ee6-2a71c14facb9/init/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.558251 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.611380 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.770952 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mbrdb_8982df7b-970d-41de-8ee6-2a71c14facb9/init/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.801763 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-skdmv"] Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.807714 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mbrdb_8982df7b-970d-41de-8ee6-2a71c14facb9/dnsmasq-dns/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.818174 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_09414c20-b9cb-44ce-a829-112cc2f307d1/glance-httpd/0.log" Feb 15 17:52:23 crc kubenswrapper[4585]: I0215 17:52:23.962878 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_09414c20-b9cb-44ce-a829-112cc2f307d1/glance-log/0.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.064419 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3c0fc65d-8f21-4e29-819a-06cb632e02cf/glance-httpd/0.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.088904 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3c0fc65d-8f21-4e29-819a-06cb632e02cf/glance-log/0.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.279421 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5fb7dd448-vc5x5_b1bd46e7-0703-49b5-81f2-516568284547/horizon/2.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.332390 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5fb7dd448-vc5x5_b1bd46e7-0703-49b5-81f2-516568284547/horizon/1.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.521178 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5fb7dd448-vc5x5_b1bd46e7-0703-49b5-81f2-516568284547/horizon-log/0.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.629048 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-756b6f8c5f-5zd85_15f27cce-5856-41d0-8528-95eba7431a98/keystone-api/0.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.780944 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_23bad4a7-c77d-4720-92df-7d126a0f079c/kube-state-metrics/0.log" Feb 15 17:52:24 crc kubenswrapper[4585]: I0215 17:52:24.830045 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-skdmv" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="registry-server" containerID="cri-o://e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87" gracePeriod=2 Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.184106 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65d8658977-kc9xn_732fcac3-39e1-4937-9a97-f243a37bc41b/neutron-api/0.log" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.287182 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65d8658977-kc9xn_732fcac3-39e1-4937-9a97-f243a37bc41b/neutron-httpd/0.log" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.360520 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.486056 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-catalog-content\") pod \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.486129 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-utilities\") pod \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.486209 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbnzp\" (UniqueName: \"kubernetes.io/projected/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-kube-api-access-nbnzp\") pod \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\" (UID: \"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb\") " Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.487722 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-utilities" (OuterVolumeSpecName: "utilities") pod "efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" (UID: "efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.515473 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-kube-api-access-nbnzp" (OuterVolumeSpecName: "kube-api-access-nbnzp") pod "efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" (UID: "efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb"). InnerVolumeSpecName "kube-api-access-nbnzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.554156 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" (UID: "efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.587893 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.587925 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.587936 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbnzp\" (UniqueName: \"kubernetes.io/projected/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb-kube-api-access-nbnzp\") on node \"crc\" DevicePath \"\"" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.715777 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_61a2428b-0f64-47db-b464-cc21bac70b83/nova-api-log/0.log" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.747480 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_61a2428b-0f64-47db-b464-cc21bac70b83/nova-api-api/0.log" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.845133 4585 generic.go:334] "Generic (PLEG): container finished" podID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerID="e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87" exitCode=0 Feb 15 17:52:25 crc 
kubenswrapper[4585]: I0215 17:52:25.845641 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-skdmv" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.845206 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skdmv" event={"ID":"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb","Type":"ContainerDied","Data":"e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87"} Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.847017 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-skdmv" event={"ID":"efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb","Type":"ContainerDied","Data":"8d207176c5bb5d1013ecc6bac6add2be4165513f09ce3ee7c774776800444de2"} Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.847038 4585 scope.go:117] "RemoveContainer" containerID="e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.889229 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-skdmv"] Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.890239 4585 scope.go:117] "RemoveContainer" containerID="dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.902315 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-skdmv"] Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.924753 4585 scope.go:117] "RemoveContainer" containerID="bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.966883 4585 scope.go:117] "RemoveContainer" containerID="e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87" Feb 15 17:52:25 crc kubenswrapper[4585]: E0215 17:52:25.967534 4585 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87\": container with ID starting with e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87 not found: ID does not exist" containerID="e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.967564 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87"} err="failed to get container status \"e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87\": rpc error: code = NotFound desc = could not find container \"e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87\": container with ID starting with e73a5ea6d16218de48212cbf20e1db2deb1f28ff23db9dd2b8ce7c882ccdea87 not found: ID does not exist" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.967582 4585 scope.go:117] "RemoveContainer" containerID="dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6" Feb 15 17:52:25 crc kubenswrapper[4585]: E0215 17:52:25.967901 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6\": container with ID starting with dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6 not found: ID does not exist" containerID="dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.967921 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6"} err="failed to get container status \"dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6\": rpc error: code = NotFound desc = could not find container 
\"dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6\": container with ID starting with dd93dfbf8cb1214772585973c8393aac44690271fa6ed1fe4ec00300cbc092a6 not found: ID does not exist" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.967935 4585 scope.go:117] "RemoveContainer" containerID="bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f" Feb 15 17:52:25 crc kubenswrapper[4585]: E0215 17:52:25.968200 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f\": container with ID starting with bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f not found: ID does not exist" containerID="bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f" Feb 15 17:52:25 crc kubenswrapper[4585]: I0215 17:52:25.968224 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f"} err="failed to get container status \"bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f\": rpc error: code = NotFound desc = could not find container \"bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f\": container with ID starting with bc3d13c7401d8a1bb701aeda1ee4eb5ebf75fa0368a1bd4df515f8c44848932f not found: ID does not exist" Feb 15 17:52:26 crc kubenswrapper[4585]: I0215 17:52:26.073888 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_254bedb1-5fad-4481-a643-4c7b6872eaf0/nova-cell0-conductor-conductor/0.log" Feb 15 17:52:26 crc kubenswrapper[4585]: I0215 17:52:26.192267 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3/nova-metadata-log/0.log" Feb 15 17:52:26 crc kubenswrapper[4585]: I0215 17:52:26.555159 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_4170aae6-85c6-408f-a00b-2e3869fae11e/nova-scheduler-scheduler/0.log" Feb 15 17:52:26 crc kubenswrapper[4585]: I0215 17:52:26.751275 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_858caedb-5be5-49ce-b806-489e3c0531b5/mysql-bootstrap/0.log" Feb 15 17:52:26 crc kubenswrapper[4585]: I0215 17:52:26.860240 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" path="/var/lib/kubelet/pods/efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb/volumes" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.012830 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_858caedb-5be5-49ce-b806-489e3c0531b5/mysql-bootstrap/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.033315 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_858caedb-5be5-49ce-b806-489e3c0531b5/galera/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.198990 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f0be2ff3-7a84-4d4e-9fab-f1b2e44bb4a3/nova-metadata-metadata/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.290844 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c/mysql-bootstrap/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.450650 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c/galera/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.515204 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0fe1a2fc-1a9a-4409-a5a9-29ef7e2ca78c/mysql-bootstrap/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.532509 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_3682ca9c-f964-4e4b-ba4a-489a96ef3f65/openstackclient/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.727537 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-87pkc_2ebbfc0f-5cfc-4e44-81e3-dfc5d67ec8b9/ovn-controller/0.log" Feb 15 17:52:27 crc kubenswrapper[4585]: I0215 17:52:27.775799 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ntdrw_a706b455-b1aa-4b2d-9ee3-714cb8801089/openstack-network-exporter/0.log" Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.011482 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hzct7_50cd83ac-87a5-46e8-be00-9b8cf954efe0/ovsdb-server-init/0.log" Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.281456 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hzct7_50cd83ac-87a5-46e8-be00-9b8cf954efe0/ovsdb-server-init/0.log" Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.321982 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hzct7_50cd83ac-87a5-46e8-be00-9b8cf954efe0/ovs-vswitchd/0.log" Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.334196 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hzct7_50cd83ac-87a5-46e8-be00-9b8cf954efe0/ovsdb-server/0.log" Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.537266 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25a62375-a3a2-44ec-b5e3-e03e3da6257e/openstack-network-exporter/0.log" Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.647507 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_25a62375-a3a2-44ec-b5e3-e03e3da6257e/ovn-northd/0.log" Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.769329 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_ba94e4cb-032f-451a-afd7-f908bef47709/openstack-network-exporter/0.log"
Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.811040 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ba94e4cb-032f-451a-afd7-f908bef47709/ovsdbserver-nb/0.log"
Feb 15 17:52:28 crc kubenswrapper[4585]: I0215 17:52:28.960230 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e892e065-8113-4612-a7f6-808490c8b000/openstack-network-exporter/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.027336 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e892e065-8113-4612-a7f6-808490c8b000/ovsdbserver-sb/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.273350 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5dfc4c95db-jlklr_60d01fff-4dd5-4cc0-9cce-06d41728c238/placement-log/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.339721 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5dfc4c95db-jlklr_60d01fff-4dd5-4cc0-9cce-06d41728c238/placement-api/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.470725 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9de65e3c-3874-4fc0-9566-84138bb228b7/setup-container/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.583418 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9de65e3c-3874-4fc0-9566-84138bb228b7/setup-container/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.626526 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9de65e3c-3874-4fc0-9566-84138bb228b7/rabbitmq/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.780127 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d9f8f5c8c-pfm4d_827a8b91-c0e1-4ba9-a90a-e0767e9fb71e/proxy-httpd/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.841827 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d9f8f5c8c-pfm4d_827a8b91-c0e1-4ba9-a90a-e0767e9fb71e/proxy-server/0.log"
Feb 15 17:52:29 crc kubenswrapper[4585]: I0215 17:52:29.964427 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6ls6x_fd5d7c58-38f8-40cb-89c0-6f97f6063ca6/swift-ring-rebalance/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.114748 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/account-auditor/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.139278 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/account-reaper/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.293049 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/account-replicator/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.390025 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/container-auditor/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.429004 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/account-server/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.482828 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/container-replicator/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.565911 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/container-server/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.620741 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/object-auditor/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.668036 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/container-updater/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.792679 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/object-replicator/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.836844 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/object-expirer/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.944454 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/object-server/0.log"
Feb 15 17:52:30 crc kubenswrapper[4585]: I0215 17:52:30.948211 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/object-updater/0.log"
Feb 15 17:52:31 crc kubenswrapper[4585]: I0215 17:52:31.022275 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/rsync/0.log"
Feb 15 17:52:31 crc kubenswrapper[4585]: I0215 17:52:31.050398 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92c8f627-225d-40ac-827b-d2f3476c1768/swift-recon-cron/0.log"
Feb 15 17:52:34 crc kubenswrapper[4585]: I0215 17:52:34.494068 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_66d1b09a-b816-4f0f-be98-6963462597ab/memcached/0.log"
Feb 15 17:52:47 crc kubenswrapper[4585]: I0215 17:52:47.014166 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 15 17:52:47 crc kubenswrapper[4585]: I0215 17:52:47.015230 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 15 17:52:56 crc kubenswrapper[4585]: I0215 17:52:56.343649 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq_b9fb975a-75bb-42c7-862e-50a72ccb6c1e/util/0.log"
Feb 15 17:52:56 crc kubenswrapper[4585]: I0215 17:52:56.542737 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq_b9fb975a-75bb-42c7-862e-50a72ccb6c1e/pull/0.log"
Feb 15 17:52:56 crc kubenswrapper[4585]: I0215 17:52:56.597118 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq_b9fb975a-75bb-42c7-862e-50a72ccb6c1e/pull/0.log"
Feb 15 17:52:56 crc kubenswrapper[4585]: I0215 17:52:56.641993 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq_b9fb975a-75bb-42c7-862e-50a72ccb6c1e/util/0.log"
Feb 15 17:52:56 crc kubenswrapper[4585]: I0215 17:52:56.784938 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq_b9fb975a-75bb-42c7-862e-50a72ccb6c1e/pull/0.log"
Feb 15 17:52:56 crc kubenswrapper[4585]: I0215 17:52:56.798993 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq_b9fb975a-75bb-42c7-862e-50a72ccb6c1e/util/0.log"
Feb 15 17:52:56 crc kubenswrapper[4585]: I0215 17:52:56.835143 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3d7e9d942328567b08c0f4bb65590213d34c7c4ad0dfa9e0663e9d569dzdfcq_b9fb975a-75bb-42c7-862e-50a72ccb6c1e/extract/0.log"
Feb 15 17:52:57 crc kubenswrapper[4585]: I0215 17:52:57.443540 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-vs4dd_c1b6598f-2367-4118-9e8c-90018190d1fb/manager/0.log"
Feb 15 17:52:57 crc kubenswrapper[4585]: I0215 17:52:57.895906 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-j9lfr_72febc8a-8640-43c3-a6d2-5ca8156d827a/manager/0.log"
Feb 15 17:52:58 crc kubenswrapper[4585]: I0215 17:52:58.153655 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-zf88x_91b09b7a-0686-4b3b-8aa5-4596b1fb5ec2/manager/0.log"
Feb 15 17:52:58 crc kubenswrapper[4585]: I0215 17:52:58.403319 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-p8qvp_64500064-0cb9-4c1e-9370-960a7aa9617c/manager/0.log"
Feb 15 17:52:58 crc kubenswrapper[4585]: I0215 17:52:58.443696 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-p75lp_345af412-80a4-4d2b-9738-9ed005847c6a/manager/0.log"
Feb 15 17:52:58 crc kubenswrapper[4585]: I0215 17:52:58.965422 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-chqj4_d32edc1b-dcb5-4338-801c-fb1657a78892/manager/0.log"
Feb 15 17:52:59 crc kubenswrapper[4585]: I0215 17:52:59.156720 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7c4bfc5b96-482zr_9ee8eb66-b9ff-4ab0-8fb9-8a85a4511414/manager/0.log"
Feb 15 17:52:59 crc kubenswrapper[4585]: I0215 17:52:59.310519 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-btbfc_cb189c3a-a98e-4fc4-b074-ce2e17b2950b/manager/0.log"
Feb 15 17:52:59 crc kubenswrapper[4585]: I0215 17:52:59.458832 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-5cqqm_3d704152-30bf-4588-ba21-bc5f23265fb6/manager/0.log"
Feb 15 17:52:59 crc kubenswrapper[4585]: I0215 17:52:59.723070 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-f2wrc_1bf81164-cb02-4381-a1e5-b28b7648f613/manager/0.log"
Feb 15 17:53:00 crc kubenswrapper[4585]: I0215 17:53:00.003874 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-ndnrr_abd1d7bd-5fd3-41cc-a9b1-1cd9ac79dd7b/manager/0.log"
Feb 15 17:53:00 crc kubenswrapper[4585]: I0215 17:53:00.183255 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-ztxlq_afc8afe1-32b1-47ef-9fd4-6331fec926f5/manager/0.log"
Feb 15 17:53:00 crc kubenswrapper[4585]: I0215 17:53:00.505102 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84966cf5c4dw6jt_4378a5c2-4e4a-422a-9cd5-b55433ac3fbe/manager/0.log"
Feb 15 17:53:01 crc kubenswrapper[4585]: I0215 17:53:01.035166 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-567dc79d78-vrx9l_e70b72bf-8467-4e87-b021-be653a6d218e/operator/0.log"
Feb 15 17:53:01 crc kubenswrapper[4585]: I0215 17:53:01.465446 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-fvsm5_58be13ef-eba2-4fdd-9fff-2b96d1b38143/manager/0.log"
Feb 15 17:53:01 crc kubenswrapper[4585]: I0215 17:53:01.660113 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-svdn7_725f7d01-0363-4872-9e03-494df9cdd50a/registry-server/0.log"
Feb 15 17:53:01 crc kubenswrapper[4585]: I0215 17:53:01.886766 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-8tzvv_7796cdbe-ab03-4bac-b2cc-e828e42f438f/manager/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.018921 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-7s6mk_e447b2b4-5bfe-4481-a36a-241124fd507a/manager/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.179687 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-4jh78_144ca353-5e11-4eab-a29e-71e41e63ea9f/operator/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.396304 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-5dh24_58ebd0ec-b27f-493b-accb-7c43c2408f19/manager/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.437418 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66bb5545bf-x7sb4_4a8513b3-0e8a-44d3-9fe6-781cac50db0a/manager/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.675841 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-rh8kk_8a5c9ab0-1600-4130-8293-6672efc2188d/manager/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.692423 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-2l54v_cc9f7d61-65f7-44d8-8fc6-2e0c8c32cdf8/manager/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.914358 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-vvfcn_82f1699a-e706-412e-af47-89b0ed090f92/manager/0.log"
Feb 15 17:53:02 crc kubenswrapper[4585]: I0215 17:53:02.989065 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-rr6q7_a0a88360-7506-4420-a652-8abb63a4f2ea/manager/0.log"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.014256 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.014732 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.092748 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5f2bg"]
Feb 15 17:53:17 crc kubenswrapper[4585]: E0215 17:53:17.093324 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="extract-content"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.093348 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="extract-content"
Feb 15 17:53:17 crc kubenswrapper[4585]: E0215 17:53:17.093384 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="registry-server"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.093393 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="registry-server"
Feb 15 17:53:17 crc kubenswrapper[4585]: E0215 17:53:17.093407 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="extract-utilities"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.093413 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="extract-utilities"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.093646 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd2e5dd-d1c6-4cb1-ac7a-6e2d0ebba7cb" containerName="registry-server"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.095614 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.104293 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f2bg"]
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.234003 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-catalog-content\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.234047 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-utilities\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.234117 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czh2t\" (UniqueName: \"kubernetes.io/projected/84673f20-adb3-4639-b831-ba948cbe9410-kube-api-access-czh2t\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.335739 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-catalog-content\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.335800 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-utilities\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.335847 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czh2t\" (UniqueName: \"kubernetes.io/projected/84673f20-adb3-4639-b831-ba948cbe9410-kube-api-access-czh2t\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.336230 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-catalog-content\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.336294 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-utilities\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.364412 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czh2t\" (UniqueName: \"kubernetes.io/projected/84673f20-adb3-4639-b831-ba948cbe9410-kube-api-access-czh2t\") pod \"redhat-operators-5f2bg\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") " pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:17 crc kubenswrapper[4585]: I0215 17:53:17.426671 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:18 crc kubenswrapper[4585]: I0215 17:53:18.068665 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f2bg"]
Feb 15 17:53:18 crc kubenswrapper[4585]: I0215 17:53:18.408544 4585 generic.go:334] "Generic (PLEG): container finished" podID="84673f20-adb3-4639-b831-ba948cbe9410" containerID="68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb" exitCode=0
Feb 15 17:53:18 crc kubenswrapper[4585]: I0215 17:53:18.408663 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2bg" event={"ID":"84673f20-adb3-4639-b831-ba948cbe9410","Type":"ContainerDied","Data":"68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb"}
Feb 15 17:53:18 crc kubenswrapper[4585]: I0215 17:53:18.408873 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2bg" event={"ID":"84673f20-adb3-4639-b831-ba948cbe9410","Type":"ContainerStarted","Data":"5ed277fcc676955f804065cc3e81dd97ce20c26e213f3cd62b6d69c44fbf5dab"}
Feb 15 17:53:19 crc kubenswrapper[4585]: I0215 17:53:19.420559 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2bg" event={"ID":"84673f20-adb3-4639-b831-ba948cbe9410","Type":"ContainerStarted","Data":"6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132"}
Feb 15 17:53:24 crc kubenswrapper[4585]: I0215 17:53:24.177542 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zfq7z_389330df-47c0-4815-9070-2664655acaab/control-plane-machine-set-operator/0.log"
Feb 15 17:53:24 crc kubenswrapper[4585]: I0215 17:53:24.360806 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qdp6s_d76d195a-b0da-4f95-9bc3-a7d9510e749a/kube-rbac-proxy/0.log"
Feb 15 17:53:24 crc kubenswrapper[4585]: I0215 17:53:24.443290 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qdp6s_d76d195a-b0da-4f95-9bc3-a7d9510e749a/machine-api-operator/0.log"
Feb 15 17:53:24 crc kubenswrapper[4585]: I0215 17:53:24.468484 4585 generic.go:334] "Generic (PLEG): container finished" podID="84673f20-adb3-4639-b831-ba948cbe9410" containerID="6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132" exitCode=0
Feb 15 17:53:24 crc kubenswrapper[4585]: I0215 17:53:24.468519 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2bg" event={"ID":"84673f20-adb3-4639-b831-ba948cbe9410","Type":"ContainerDied","Data":"6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132"}
Feb 15 17:53:25 crc kubenswrapper[4585]: I0215 17:53:25.477227 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2bg" event={"ID":"84673f20-adb3-4639-b831-ba948cbe9410","Type":"ContainerStarted","Data":"8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b"}
Feb 15 17:53:25 crc kubenswrapper[4585]: I0215 17:53:25.493173 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5f2bg" podStartSLOduration=2.047667057 podStartE2EDuration="8.493157321s" podCreationTimestamp="2026-02-15 17:53:17 +0000 UTC" firstStartedPulling="2026-02-15 17:53:18.410226333 +0000 UTC m=+2854.353634465" lastFinishedPulling="2026-02-15 17:53:24.855716607 +0000 UTC m=+2860.799124729" observedRunningTime="2026-02-15 17:53:25.491719212 +0000 UTC m=+2861.435127354" watchObservedRunningTime="2026-02-15 17:53:25.493157321 +0000 UTC m=+2861.436565453"
Feb 15 17:53:27 crc kubenswrapper[4585]: I0215 17:53:27.427845 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:27 crc kubenswrapper[4585]: I0215 17:53:27.428046 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:28 crc kubenswrapper[4585]: I0215 17:53:28.478262 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5f2bg" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="registry-server" probeResult="failure" output=<
Feb 15 17:53:28 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s
Feb 15 17:53:28 crc kubenswrapper[4585]: >
Feb 15 17:53:38 crc kubenswrapper[4585]: I0215 17:53:38.487414 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5f2bg" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="registry-server" probeResult="failure" output=<
Feb 15 17:53:38 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s
Feb 15 17:53:38 crc kubenswrapper[4585]: >
Feb 15 17:53:38 crc kubenswrapper[4585]: I0215 17:53:38.853372 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-gvnsl_260f1bf0-a58d-492f-9939-30c20b324a78/cert-manager-controller/0.log"
Feb 15 17:53:39 crc kubenswrapper[4585]: I0215 17:53:39.006717 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kqnpg_a53d4900-9905-4d34-acbf-4ef911683c2c/cert-manager-cainjector/0.log"
Feb 15 17:53:39 crc kubenswrapper[4585]: I0215 17:53:39.119533 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-v8flx_ee9f4317-b1fe-44a6-a4df-f14563acc190/cert-manager-webhook/0.log"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.015921 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.016578 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.016644 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hptv"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.017181 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44"} pod="openshift-machine-config-operator/machine-config-daemon-4hptv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.017230 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" containerID="cri-o://667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" gracePeriod=600
Feb 15 17:53:47 crc kubenswrapper[4585]: E0215 17:53:47.136985 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.522131 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.586490 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.679486 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" exitCode=0
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.679577 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerDied","Data":"667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44"}
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.679717 4585 scope.go:117] "RemoveContainer" containerID="707e2cabdae5fd4224842942496f67e0a37a5dc60a81ac67d588c49fa31af510"
Feb 15 17:53:47 crc kubenswrapper[4585]: I0215 17:53:47.680313 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44"
Feb 15 17:53:47 crc kubenswrapper[4585]: E0215 17:53:47.680631 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2"
Feb 15 17:53:48 crc kubenswrapper[4585]: I0215 17:53:48.292524 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f2bg"]
Feb 15 17:53:48 crc kubenswrapper[4585]: I0215 17:53:48.692391 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5f2bg" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="registry-server" containerID="cri-o://8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b" gracePeriod=2
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.171786 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.265072 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czh2t\" (UniqueName: \"kubernetes.io/projected/84673f20-adb3-4639-b831-ba948cbe9410-kube-api-access-czh2t\") pod \"84673f20-adb3-4639-b831-ba948cbe9410\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") "
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.265216 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-utilities\") pod \"84673f20-adb3-4639-b831-ba948cbe9410\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") "
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.265352 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-catalog-content\") pod \"84673f20-adb3-4639-b831-ba948cbe9410\" (UID: \"84673f20-adb3-4639-b831-ba948cbe9410\") "
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.283177 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-utilities" (OuterVolumeSpecName: "utilities") pod "84673f20-adb3-4639-b831-ba948cbe9410" (UID: "84673f20-adb3-4639-b831-ba948cbe9410"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.285104 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84673f20-adb3-4639-b831-ba948cbe9410-kube-api-access-czh2t" (OuterVolumeSpecName: "kube-api-access-czh2t") pod "84673f20-adb3-4639-b831-ba948cbe9410" (UID: "84673f20-adb3-4639-b831-ba948cbe9410"). InnerVolumeSpecName "kube-api-access-czh2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.367293 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czh2t\" (UniqueName: \"kubernetes.io/projected/84673f20-adb3-4639-b831-ba948cbe9410-kube-api-access-czh2t\") on node \"crc\" DevicePath \"\""
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.367324 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-utilities\") on node \"crc\" DevicePath \"\""
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.397757 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84673f20-adb3-4639-b831-ba948cbe9410" (UID: "84673f20-adb3-4639-b831-ba948cbe9410"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.469766 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84673f20-adb3-4639-b831-ba948cbe9410-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.715056 4585 generic.go:334] "Generic (PLEG): container finished" podID="84673f20-adb3-4639-b831-ba948cbe9410" containerID="8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b" exitCode=0
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.715118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2bg" event={"ID":"84673f20-adb3-4639-b831-ba948cbe9410","Type":"ContainerDied","Data":"8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b"}
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.715161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2bg" event={"ID":"84673f20-adb3-4639-b831-ba948cbe9410","Type":"ContainerDied","Data":"5ed277fcc676955f804065cc3e81dd97ce20c26e213f3cd62b6d69c44fbf5dab"}
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.715191 4585 scope.go:117] "RemoveContainer" containerID="8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b"
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.715304 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2bg"
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.769775 4585 scope.go:117] "RemoveContainer" containerID="6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132"
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.776376 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f2bg"]
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.792044 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5f2bg"]
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.816780 4585 scope.go:117] "RemoveContainer" containerID="68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb"
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.862948 4585 scope.go:117] "RemoveContainer" containerID="8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b"
Feb 15 17:53:49 crc kubenswrapper[4585]: E0215 17:53:49.864500 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b\": container with ID starting with 8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b not found: ID does not exist" containerID="8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b"
Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.864528 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b"} err="failed to get container status \"8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b\": rpc error: code = NotFound desc = could not find container \"8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b\": container with ID starting with 8b2f08a1282616ce2f87b2f8c118b6543c503fa0a2c15179e0aa50f93bfdc77b not found: ID does
not exist" Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.864545 4585 scope.go:117] "RemoveContainer" containerID="6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132" Feb 15 17:53:49 crc kubenswrapper[4585]: E0215 17:53:49.865200 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132\": container with ID starting with 6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132 not found: ID does not exist" containerID="6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132" Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.865230 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132"} err="failed to get container status \"6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132\": rpc error: code = NotFound desc = could not find container \"6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132\": container with ID starting with 6ab7a80036a96146834ecc9a09699b879c16b7e1a455e2bbde01574f3e5f7132 not found: ID does not exist" Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.865243 4585 scope.go:117] "RemoveContainer" containerID="68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb" Feb 15 17:53:49 crc kubenswrapper[4585]: E0215 17:53:49.865565 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb\": container with ID starting with 68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb not found: ID does not exist" containerID="68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb" Feb 15 17:53:49 crc kubenswrapper[4585]: I0215 17:53:49.865585 4585 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb"} err="failed to get container status \"68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb\": rpc error: code = NotFound desc = could not find container \"68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb\": container with ID starting with 68a3ebed2d3b9e217a7a82d90d11186aafb027d0534789cb6ea7e67324230ddb not found: ID does not exist" Feb 15 17:53:50 crc kubenswrapper[4585]: I0215 17:53:50.856718 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84673f20-adb3-4639-b831-ba948cbe9410" path="/var/lib/kubelet/pods/84673f20-adb3-4639-b831-ba948cbe9410/volumes" Feb 15 17:53:53 crc kubenswrapper[4585]: I0215 17:53:53.401319 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-v6dds_06076f8a-c221-46f7-a72b-2287367d08c8/nmstate-console-plugin/0.log" Feb 15 17:53:53 crc kubenswrapper[4585]: I0215 17:53:53.518999 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-9zlpw_7e13449a-7362-4fca-98c9-8ba86698e6e7/nmstate-handler/0.log" Feb 15 17:53:53 crc kubenswrapper[4585]: I0215 17:53:53.674375 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-d8x4k_fd6e126c-f69c-4a79-932b-976d3cb97f83/nmstate-metrics/0.log" Feb 15 17:53:53 crc kubenswrapper[4585]: I0215 17:53:53.675775 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-d8x4k_fd6e126c-f69c-4a79-932b-976d3cb97f83/kube-rbac-proxy/0.log" Feb 15 17:53:53 crc kubenswrapper[4585]: I0215 17:53:53.870675 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-zk68t_b3c69254-35c7-4f91-b059-6a72be7af29f/nmstate-webhook/0.log" Feb 15 17:53:53 crc kubenswrapper[4585]: I0215 
17:53:53.873639 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rbvbt_db3179ce-d468-434b-9f2c-7fce08fb2ce3/nmstate-operator/0.log" Feb 15 17:53:59 crc kubenswrapper[4585]: I0215 17:53:59.842740 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:53:59 crc kubenswrapper[4585]: E0215 17:53:59.843536 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:54:01 crc kubenswrapper[4585]: I0215 17:54:01.937329 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6xq2g"] Feb 15 17:54:01 crc kubenswrapper[4585]: E0215 17:54:01.938066 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="registry-server" Feb 15 17:54:01 crc kubenswrapper[4585]: I0215 17:54:01.938080 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="registry-server" Feb 15 17:54:01 crc kubenswrapper[4585]: E0215 17:54:01.938099 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="extract-utilities" Feb 15 17:54:01 crc kubenswrapper[4585]: I0215 17:54:01.938104 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="extract-utilities" Feb 15 17:54:01 crc kubenswrapper[4585]: E0215 17:54:01.938124 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="extract-content" Feb 15 17:54:01 crc kubenswrapper[4585]: I0215 17:54:01.938129 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="extract-content" Feb 15 17:54:01 crc kubenswrapper[4585]: I0215 17:54:01.938348 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="84673f20-adb3-4639-b831-ba948cbe9410" containerName="registry-server" Feb 15 17:54:01 crc kubenswrapper[4585]: I0215 17:54:01.940365 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:01 crc kubenswrapper[4585]: I0215 17:54:01.953535 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xq2g"] Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.055478 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-catalog-content\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.055862 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-utilities\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.056025 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzxp8\" (UniqueName: \"kubernetes.io/projected/124fe4ca-8f5e-4405-97c5-edf3963a1534-kube-api-access-fzxp8\") pod \"redhat-marketplace-6xq2g\" (UID: 
\"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.159934 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-catalog-content\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.160369 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-catalog-content\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.160504 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-utilities\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.160781 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-utilities\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.160812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzxp8\" (UniqueName: \"kubernetes.io/projected/124fe4ca-8f5e-4405-97c5-edf3963a1534-kube-api-access-fzxp8\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " 
pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.207537 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzxp8\" (UniqueName: \"kubernetes.io/projected/124fe4ca-8f5e-4405-97c5-edf3963a1534-kube-api-access-fzxp8\") pod \"redhat-marketplace-6xq2g\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.282049 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.798721 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xq2g"] Feb 15 17:54:02 crc kubenswrapper[4585]: I0215 17:54:02.854140 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xq2g" event={"ID":"124fe4ca-8f5e-4405-97c5-edf3963a1534","Type":"ContainerStarted","Data":"8415ac392d97878df94c9a48ba5b5e0ff1628e3d306b4ac1beb0fd4af8df19f7"} Feb 15 17:54:03 crc kubenswrapper[4585]: I0215 17:54:03.861440 4585 generic.go:334] "Generic (PLEG): container finished" podID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerID="e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453" exitCode=0 Feb 15 17:54:03 crc kubenswrapper[4585]: I0215 17:54:03.861497 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xq2g" event={"ID":"124fe4ca-8f5e-4405-97c5-edf3963a1534","Type":"ContainerDied","Data":"e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453"} Feb 15 17:54:03 crc kubenswrapper[4585]: I0215 17:54:03.863704 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:54:05 crc kubenswrapper[4585]: I0215 17:54:05.882956 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerID="bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d" exitCode=0 Feb 15 17:54:05 crc kubenswrapper[4585]: I0215 17:54:05.883021 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xq2g" event={"ID":"124fe4ca-8f5e-4405-97c5-edf3963a1534","Type":"ContainerDied","Data":"bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d"} Feb 15 17:54:06 crc kubenswrapper[4585]: I0215 17:54:06.903581 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xq2g" event={"ID":"124fe4ca-8f5e-4405-97c5-edf3963a1534","Type":"ContainerStarted","Data":"e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec"} Feb 15 17:54:06 crc kubenswrapper[4585]: I0215 17:54:06.938311 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6xq2g" podStartSLOduration=3.558799308 podStartE2EDuration="5.938297877s" podCreationTimestamp="2026-02-15 17:54:01 +0000 UTC" firstStartedPulling="2026-02-15 17:54:03.863446426 +0000 UTC m=+2899.806854558" lastFinishedPulling="2026-02-15 17:54:06.242944995 +0000 UTC m=+2902.186353127" observedRunningTime="2026-02-15 17:54:06.936245992 +0000 UTC m=+2902.879654124" watchObservedRunningTime="2026-02-15 17:54:06.938297877 +0000 UTC m=+2902.881706009" Feb 15 17:54:10 crc kubenswrapper[4585]: I0215 17:54:10.841968 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:54:10 crc kubenswrapper[4585]: E0215 17:54:10.842826 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:54:12 crc kubenswrapper[4585]: I0215 17:54:12.282695 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:12 crc kubenswrapper[4585]: I0215 17:54:12.283266 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:12 crc kubenswrapper[4585]: I0215 17:54:12.337703 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:13 crc kubenswrapper[4585]: I0215 17:54:13.013269 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:13 crc kubenswrapper[4585]: I0215 17:54:13.064305 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xq2g"] Feb 15 17:54:14 crc kubenswrapper[4585]: I0215 17:54:14.987917 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6xq2g" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="registry-server" containerID="cri-o://e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec" gracePeriod=2 Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.527139 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.612160 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-utilities\") pod \"124fe4ca-8f5e-4405-97c5-edf3963a1534\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.612640 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzxp8\" (UniqueName: \"kubernetes.io/projected/124fe4ca-8f5e-4405-97c5-edf3963a1534-kube-api-access-fzxp8\") pod \"124fe4ca-8f5e-4405-97c5-edf3963a1534\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.612694 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-catalog-content\") pod \"124fe4ca-8f5e-4405-97c5-edf3963a1534\" (UID: \"124fe4ca-8f5e-4405-97c5-edf3963a1534\") " Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.613641 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-utilities" (OuterVolumeSpecName: "utilities") pod "124fe4ca-8f5e-4405-97c5-edf3963a1534" (UID: "124fe4ca-8f5e-4405-97c5-edf3963a1534"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.635061 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124fe4ca-8f5e-4405-97c5-edf3963a1534-kube-api-access-fzxp8" (OuterVolumeSpecName: "kube-api-access-fzxp8") pod "124fe4ca-8f5e-4405-97c5-edf3963a1534" (UID: "124fe4ca-8f5e-4405-97c5-edf3963a1534"). InnerVolumeSpecName "kube-api-access-fzxp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.643975 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "124fe4ca-8f5e-4405-97c5-edf3963a1534" (UID: "124fe4ca-8f5e-4405-97c5-edf3963a1534"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.715326 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.715379 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzxp8\" (UniqueName: \"kubernetes.io/projected/124fe4ca-8f5e-4405-97c5-edf3963a1534-kube-api-access-fzxp8\") on node \"crc\" DevicePath \"\"" Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.715397 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124fe4ca-8f5e-4405-97c5-edf3963a1534-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.999956 4585 generic.go:334] "Generic (PLEG): container finished" podID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerID="e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec" exitCode=0 Feb 15 17:54:15 crc kubenswrapper[4585]: I0215 17:54:15.999995 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xq2g" event={"ID":"124fe4ca-8f5e-4405-97c5-edf3963a1534","Type":"ContainerDied","Data":"e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec"} Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.000019 4585 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-6xq2g" event={"ID":"124fe4ca-8f5e-4405-97c5-edf3963a1534","Type":"ContainerDied","Data":"8415ac392d97878df94c9a48ba5b5e0ff1628e3d306b4ac1beb0fd4af8df19f7"} Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.000037 4585 scope.go:117] "RemoveContainer" containerID="e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.000143 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xq2g" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.033747 4585 scope.go:117] "RemoveContainer" containerID="bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.103134 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xq2g"] Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.125437 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xq2g"] Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.137217 4585 scope.go:117] "RemoveContainer" containerID="e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.204753 4585 scope.go:117] "RemoveContainer" containerID="e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec" Feb 15 17:54:16 crc kubenswrapper[4585]: E0215 17:54:16.217850 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec\": container with ID starting with e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec not found: ID does not exist" containerID="e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.217913 4585 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec"} err="failed to get container status \"e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec\": rpc error: code = NotFound desc = could not find container \"e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec\": container with ID starting with e43f0b3facd18f48619b2f397c65b5ddf834e2c64cfee3f165dc44e1992691ec not found: ID does not exist" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.217943 4585 scope.go:117] "RemoveContainer" containerID="bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d" Feb 15 17:54:16 crc kubenswrapper[4585]: E0215 17:54:16.241508 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d\": container with ID starting with bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d not found: ID does not exist" containerID="bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.241560 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d"} err="failed to get container status \"bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d\": rpc error: code = NotFound desc = could not find container \"bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d\": container with ID starting with bf453ba3717691575ac290356a0e0716ed448e993d2ccdaffa50f606b4c1028d not found: ID does not exist" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.241586 4585 scope.go:117] "RemoveContainer" containerID="e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453" Feb 15 17:54:16 crc kubenswrapper[4585]: E0215 
17:54:16.245056 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453\": container with ID starting with e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453 not found: ID does not exist" containerID="e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.245102 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453"} err="failed to get container status \"e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453\": rpc error: code = NotFound desc = could not find container \"e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453\": container with ID starting with e52bbfb9c928f8079ba939c373c1e308479b3873acb5964d36b2ebbcf9225453 not found: ID does not exist" Feb 15 17:54:16 crc kubenswrapper[4585]: I0215 17:54:16.858016 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" path="/var/lib/kubelet/pods/124fe4ca-8f5e-4405-97c5-edf3963a1534/volumes" Feb 15 17:54:23 crc kubenswrapper[4585]: I0215 17:54:23.842020 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:54:23 crc kubenswrapper[4585]: E0215 17:54:23.842658 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:54:25 crc kubenswrapper[4585]: I0215 17:54:25.941816 
4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-pk5r2_8b78ad86-3000-4ba4-b544-09b5890298e9/controller/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.020423 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-pk5r2_8b78ad86-3000-4ba4-b544-09b5890298e9/kube-rbac-proxy/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.165117 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-frr-files/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.368261 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-reloader/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.369971 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-frr-files/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.417732 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-reloader/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.438245 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-metrics/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.637013 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-frr-files/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.672089 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-reloader/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.698754 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-metrics/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.721059 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-metrics/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.912651 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-reloader/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.962619 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-frr-files/0.log" Feb 15 17:54:26 crc kubenswrapper[4585]: I0215 17:54:26.996901 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/cp-metrics/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.088168 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/controller/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.191199 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/frr-metrics/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.196089 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/kube-rbac-proxy/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.381361 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/kube-rbac-proxy-frr/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.439768 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/reloader/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.643978 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-dtjq8_4d563416-a6d9-452b-818e-6129a4343937/frr-k8s-webhook-server/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.896487 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769664b7f6-hp4wp_0005f5d7-fa80-4a9c-90d4-dbe50d95a235/manager/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.952720 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nprl8_89f3a2fa-b24d-4bbc-ad36-9727e45e2e52/frr/0.log" Feb 15 17:54:27 crc kubenswrapper[4585]: I0215 17:54:27.979328 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-577b585f8-g2ggz_3753870e-944b-4c36-aa33-deb44f4ccb64/webhook-server/0.log" Feb 15 17:54:28 crc kubenswrapper[4585]: I0215 17:54:28.123329 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mhjlp_34c11c17-aea3-4d9f-8a9b-f84da1d1a1af/kube-rbac-proxy/0.log" Feb 15 17:54:28 crc kubenswrapper[4585]: I0215 17:54:28.408249 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-mhjlp_34c11c17-aea3-4d9f-8a9b-f84da1d1a1af/speaker/0.log" Feb 15 17:54:37 crc kubenswrapper[4585]: I0215 17:54:37.843937 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:54:37 crc kubenswrapper[4585]: E0215 17:54:37.844470 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:54:43 crc kubenswrapper[4585]: I0215 17:54:43.507695 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll_64b64630-6199-44c9-811f-4bba668cf494/util/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.036209 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll_64b64630-6199-44c9-811f-4bba668cf494/pull/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.069268 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll_64b64630-6199-44c9-811f-4bba668cf494/pull/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.080759 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll_64b64630-6199-44c9-811f-4bba668cf494/util/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.266638 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll_64b64630-6199-44c9-811f-4bba668cf494/util/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.267646 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll_64b64630-6199-44c9-811f-4bba668cf494/pull/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.276575 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213k87ll_64b64630-6199-44c9-811f-4bba668cf494/extract/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.492065 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rvmlr_65dc013a-2295-442e-b092-e7735bd01de9/extract-utilities/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.664856 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rvmlr_65dc013a-2295-442e-b092-e7735bd01de9/extract-content/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.670806 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rvmlr_65dc013a-2295-442e-b092-e7735bd01de9/extract-utilities/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.750476 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rvmlr_65dc013a-2295-442e-b092-e7735bd01de9/extract-content/0.log" Feb 15 17:54:44 crc kubenswrapper[4585]: I0215 17:54:44.904717 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rvmlr_65dc013a-2295-442e-b092-e7735bd01de9/extract-utilities/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.007575 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rvmlr_65dc013a-2295-442e-b092-e7735bd01de9/extract-content/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.235756 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-44gbl_b2d833c8-fb60-4668-82c2-fc2fbb186540/extract-utilities/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.238016 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rvmlr_65dc013a-2295-442e-b092-e7735bd01de9/registry-server/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.425286 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-44gbl_b2d833c8-fb60-4668-82c2-fc2fbb186540/extract-utilities/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.450810 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-44gbl_b2d833c8-fb60-4668-82c2-fc2fbb186540/extract-content/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.486118 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-44gbl_b2d833c8-fb60-4668-82c2-fc2fbb186540/extract-content/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.685928 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-44gbl_b2d833c8-fb60-4668-82c2-fc2fbb186540/extract-utilities/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.694882 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-44gbl_b2d833c8-fb60-4668-82c2-fc2fbb186540/extract-content/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.737334 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-44gbl_b2d833c8-fb60-4668-82c2-fc2fbb186540/registry-server/0.log" Feb 15 17:54:45 crc kubenswrapper[4585]: I0215 17:54:45.876351 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5fsfv_ac617faa-373e-4ae0-8fe0-bbecee411a35/extract-utilities/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.040320 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-5fsfv_ac617faa-373e-4ae0-8fe0-bbecee411a35/extract-content/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.042611 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5fsfv_ac617faa-373e-4ae0-8fe0-bbecee411a35/extract-utilities/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.099061 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5fsfv_ac617faa-373e-4ae0-8fe0-bbecee411a35/extract-content/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.346152 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5fsfv_ac617faa-373e-4ae0-8fe0-bbecee411a35/registry-server/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.405201 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5fsfv_ac617faa-373e-4ae0-8fe0-bbecee411a35/extract-content/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.453451 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5fsfv_ac617faa-373e-4ae0-8fe0-bbecee411a35/extract-utilities/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.563047 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6n9mz_944cfefd-484c-48e3-9f72-0ee184944d34/extract-utilities/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.739168 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6n9mz_944cfefd-484c-48e3-9f72-0ee184944d34/extract-utilities/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.801218 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6n9mz_944cfefd-484c-48e3-9f72-0ee184944d34/extract-content/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.816585 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6n9mz_944cfefd-484c-48e3-9f72-0ee184944d34/extract-content/0.log" Feb 15 17:54:46 crc kubenswrapper[4585]: I0215 17:54:46.970856 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6n9mz_944cfefd-484c-48e3-9f72-0ee184944d34/extract-utilities/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.050717 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6n9mz_944cfefd-484c-48e3-9f72-0ee184944d34/registry-server/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.098270 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6n9mz_944cfefd-484c-48e3-9f72-0ee184944d34/extract-content/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.215394 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hvhv_123edd1b-642f-43ad-a1ca-632a99f27945/extract-utilities/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.408248 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hvhv_123edd1b-642f-43ad-a1ca-632a99f27945/extract-utilities/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.452885 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hvhv_123edd1b-642f-43ad-a1ca-632a99f27945/extract-content/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.500523 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-7hvhv_123edd1b-642f-43ad-a1ca-632a99f27945/extract-content/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.664077 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hvhv_123edd1b-642f-43ad-a1ca-632a99f27945/extract-content/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.707939 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hvhv_123edd1b-642f-43ad-a1ca-632a99f27945/extract-utilities/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.737934 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7hvhv_123edd1b-642f-43ad-a1ca-632a99f27945/registry-server/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.804635 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85w5q_b628bbea-ba4e-447d-a27e-22a56cea0bfc/extract-utilities/0.log" Feb 15 17:54:47 crc kubenswrapper[4585]: I0215 17:54:47.980896 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85w5q_b628bbea-ba4e-447d-a27e-22a56cea0bfc/extract-content/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.004427 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85w5q_b628bbea-ba4e-447d-a27e-22a56cea0bfc/extract-utilities/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.043767 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85w5q_b628bbea-ba4e-447d-a27e-22a56cea0bfc/extract-content/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.446116 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-85w5q_b628bbea-ba4e-447d-a27e-22a56cea0bfc/extract-content/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.561468 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85w5q_b628bbea-ba4e-447d-a27e-22a56cea0bfc/registry-server/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.582629 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-85w5q_b628bbea-ba4e-447d-a27e-22a56cea0bfc/extract-utilities/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.590471 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8c7xf_e1ff32f1-5c8f-4062-b6f6-5fb81667a532/extract-utilities/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.762106 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8c7xf_e1ff32f1-5c8f-4062-b6f6-5fb81667a532/extract-utilities/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.815482 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8c7xf_e1ff32f1-5c8f-4062-b6f6-5fb81667a532/extract-content/0.log" Feb 15 17:54:48 crc kubenswrapper[4585]: I0215 17:54:48.841650 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:54:48 crc kubenswrapper[4585]: E0215 17:54:48.841995 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:54:48 
crc kubenswrapper[4585]: I0215 17:54:48.845111 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8c7xf_e1ff32f1-5c8f-4062-b6f6-5fb81667a532/extract-content/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.022947 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8c7xf_e1ff32f1-5c8f-4062-b6f6-5fb81667a532/extract-content/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.057029 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8c7xf_e1ff32f1-5c8f-4062-b6f6-5fb81667a532/extract-utilities/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.073494 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8c7xf_e1ff32f1-5c8f-4062-b6f6-5fb81667a532/registry-server/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.129014 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9h2zb_e8137661-759d-4fea-8a45-47e438c9fbe6/extract-utilities/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.292685 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9h2zb_e8137661-759d-4fea-8a45-47e438c9fbe6/extract-utilities/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.318330 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9h2zb_e8137661-759d-4fea-8a45-47e438c9fbe6/extract-content/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.345081 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9h2zb_e8137661-759d-4fea-8a45-47e438c9fbe6/extract-content/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.538990 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9h2zb_e8137661-759d-4fea-8a45-47e438c9fbe6/extract-utilities/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.624668 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9h2zb_e8137661-759d-4fea-8a45-47e438c9fbe6/extract-content/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.657415 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9h2zb_e8137661-759d-4fea-8a45-47e438c9fbe6/registry-server/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.696360 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9lz2d_0915232d-c037-4e97-9ad2-a0e1f69b913a/extract-utilities/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.906215 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9lz2d_0915232d-c037-4e97-9ad2-a0e1f69b913a/extract-content/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.928794 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9lz2d_0915232d-c037-4e97-9ad2-a0e1f69b913a/extract-utilities/0.log" Feb 15 17:54:49 crc kubenswrapper[4585]: I0215 17:54:49.936273 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9lz2d_0915232d-c037-4e97-9ad2-a0e1f69b913a/extract-content/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.172704 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9lz2d_0915232d-c037-4e97-9ad2-a0e1f69b913a/extract-content/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.178945 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9lz2d_0915232d-c037-4e97-9ad2-a0e1f69b913a/extract-utilities/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.227198 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9lz2d_0915232d-c037-4e97-9ad2-a0e1f69b913a/registry-server/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.269119 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b7rq6_a8be94f1-f28e-494b-b9c1-d97c75ea5577/extract-utilities/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.401673 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b7rq6_a8be94f1-f28e-494b-b9c1-d97c75ea5577/extract-content/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.452459 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b7rq6_a8be94f1-f28e-494b-b9c1-d97c75ea5577/extract-utilities/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.453051 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b7rq6_a8be94f1-f28e-494b-b9c1-d97c75ea5577/extract-content/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.671828 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b7rq6_a8be94f1-f28e-494b-b9c1-d97c75ea5577/extract-utilities/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.673670 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b7rq6_a8be94f1-f28e-494b-b9c1-d97c75ea5577/registry-server/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.766749 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bj95v_550f697e-9cf9-4b02-af9d-c0fe2375bd0c/extract-utilities/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.766789 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b7rq6_a8be94f1-f28e-494b-b9c1-d97c75ea5577/extract-content/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.926843 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bj95v_550f697e-9cf9-4b02-af9d-c0fe2375bd0c/extract-utilities/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.976217 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bj95v_550f697e-9cf9-4b02-af9d-c0fe2375bd0c/extract-content/0.log" Feb 15 17:54:50 crc kubenswrapper[4585]: I0215 17:54:50.979623 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bj95v_550f697e-9cf9-4b02-af9d-c0fe2375bd0c/extract-content/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.128986 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bj95v_550f697e-9cf9-4b02-af9d-c0fe2375bd0c/extract-content/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.155681 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bj95v_550f697e-9cf9-4b02-af9d-c0fe2375bd0c/registry-server/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.183718 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bj95v_550f697e-9cf9-4b02-af9d-c0fe2375bd0c/extract-utilities/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.217424 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-cgcn2_191919ef-1328-4e66-9194-d59f017c027f/extract-utilities/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.404139 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgcn2_191919ef-1328-4e66-9194-d59f017c027f/extract-content/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.445702 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgcn2_191919ef-1328-4e66-9194-d59f017c027f/extract-content/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.459775 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgcn2_191919ef-1328-4e66-9194-d59f017c027f/extract-utilities/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.592382 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgcn2_191919ef-1328-4e66-9194-d59f017c027f/extract-content/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.622543 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgcn2_191919ef-1328-4e66-9194-d59f017c027f/extract-utilities/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.650396 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgcn2_191919ef-1328-4e66-9194-d59f017c027f/registry-server/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.667100 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dssfd_57fd9f89-0b8e-4e77-b386-30107cd20578/extract-utilities/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.832439 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-dssfd_57fd9f89-0b8e-4e77-b386-30107cd20578/extract-utilities/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.837026 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dssfd_57fd9f89-0b8e-4e77-b386-30107cd20578/extract-content/0.log" Feb 15 17:54:51 crc kubenswrapper[4585]: I0215 17:54:51.870207 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dssfd_57fd9f89-0b8e-4e77-b386-30107cd20578/extract-content/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.042742 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dssfd_57fd9f89-0b8e-4e77-b386-30107cd20578/extract-utilities/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.047506 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dssfd_57fd9f89-0b8e-4e77-b386-30107cd20578/extract-content/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.202222 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f9nwr_ec374614-bada-412d-a649-dc86e3ddaa43/extract-utilities/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.218665 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-dssfd_57fd9f89-0b8e-4e77-b386-30107cd20578/registry-server/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.305225 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f9nwr_ec374614-bada-412d-a649-dc86e3ddaa43/extract-utilities/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.330760 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-f9nwr_ec374614-bada-412d-a649-dc86e3ddaa43/extract-content/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.369215 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f9nwr_ec374614-bada-412d-a649-dc86e3ddaa43/extract-content/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.560163 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f9nwr_ec374614-bada-412d-a649-dc86e3ddaa43/extract-content/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.586065 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcdtv_e442ed18-9718-4f0d-b548-4769856b3b5d/extract-utilities/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.607569 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f9nwr_ec374614-bada-412d-a649-dc86e3ddaa43/registry-server/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.632062 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f9nwr_ec374614-bada-412d-a649-dc86e3ddaa43/extract-utilities/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.862681 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcdtv_e442ed18-9718-4f0d-b548-4769856b3b5d/extract-content/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.870440 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcdtv_e442ed18-9718-4f0d-b548-4769856b3b5d/extract-content/0.log" Feb 15 17:54:52 crc kubenswrapper[4585]: I0215 17:54:52.875367 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fcdtv_e442ed18-9718-4f0d-b548-4769856b3b5d/extract-utilities/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.021291 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcdtv_e442ed18-9718-4f0d-b548-4769856b3b5d/extract-utilities/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.021301 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcdtv_e442ed18-9718-4f0d-b548-4769856b3b5d/extract-content/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.083625 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-flrdq_2324ac96-77b5-46b6-a109-22380ef0475d/extract-utilities/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.175758 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcdtv_e442ed18-9718-4f0d-b548-4769856b3b5d/registry-server/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.352193 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-flrdq_2324ac96-77b5-46b6-a109-22380ef0475d/extract-utilities/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.409614 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-flrdq_2324ac96-77b5-46b6-a109-22380ef0475d/extract-content/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.411858 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-flrdq_2324ac96-77b5-46b6-a109-22380ef0475d/extract-content/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.636368 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gkd28_c258e26a-e1d4-4c57-9808-f3befbd2aab0/extract-utilities/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.665251 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-flrdq_2324ac96-77b5-46b6-a109-22380ef0475d/registry-server/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.684233 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-flrdq_2324ac96-77b5-46b6-a109-22380ef0475d/extract-content/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.709220 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-flrdq_2324ac96-77b5-46b6-a109-22380ef0475d/extract-utilities/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.885933 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gkd28_c258e26a-e1d4-4c57-9808-f3befbd2aab0/extract-utilities/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.890863 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gkd28_c258e26a-e1d4-4c57-9808-f3befbd2aab0/extract-content/0.log" Feb 15 17:54:53 crc kubenswrapper[4585]: I0215 17:54:53.910333 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gkd28_c258e26a-e1d4-4c57-9808-f3befbd2aab0/extract-content/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.073960 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gkd28_c258e26a-e1d4-4c57-9808-f3befbd2aab0/extract-utilities/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.125262 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gkd28_c258e26a-e1d4-4c57-9808-f3befbd2aab0/registry-server/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.154103 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gkd28_c258e26a-e1d4-4c57-9808-f3befbd2aab0/extract-content/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.216989 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gnzk5_2d95df95-a751-4c7e-adeb-a8d6e0db7767/extract-utilities/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.387824 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gnzk5_2d95df95-a751-4c7e-adeb-a8d6e0db7767/extract-content/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.392489 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gnzk5_2d95df95-a751-4c7e-adeb-a8d6e0db7767/extract-utilities/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.426449 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gnzk5_2d95df95-a751-4c7e-adeb-a8d6e0db7767/extract-content/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.516107 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gnzk5_2d95df95-a751-4c7e-adeb-a8d6e0db7767/extract-utilities/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.528288 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gnzk5_2d95df95-a751-4c7e-adeb-a8d6e0db7767/extract-content/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.707243 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hd7nq_c570ec68-3a6e-45fc-b218-7ac142f88dda/extract-utilities/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.917340 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hd7nq_c570ec68-3a6e-45fc-b218-7ac142f88dda/extract-content/0.log" Feb 15 17:54:54 crc kubenswrapper[4585]: I0215 17:54:54.963238 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hd7nq_c570ec68-3a6e-45fc-b218-7ac142f88dda/extract-content/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.080863 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hd7nq_c570ec68-3a6e-45fc-b218-7ac142f88dda/extract-utilities/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.233782 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hd7nq_c570ec68-3a6e-45fc-b218-7ac142f88dda/extract-utilities/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.312877 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hd7nq_c570ec68-3a6e-45fc-b218-7ac142f88dda/extract-content/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.344015 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hd7nq_c570ec68-3a6e-45fc-b218-7ac142f88dda/registry-server/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.411095 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gnzk5_2d95df95-a751-4c7e-adeb-a8d6e0db7767/registry-server/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.480278 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jgz2x_886dc957-8460-4c2b-99a3-92b9d869a6cd/extract-utilities/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.691419 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgz2x_886dc957-8460-4c2b-99a3-92b9d869a6cd/extract-utilities/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.694489 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgz2x_886dc957-8460-4c2b-99a3-92b9d869a6cd/extract-content/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.698474 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgz2x_886dc957-8460-4c2b-99a3-92b9d869a6cd/extract-content/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.881815 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgz2x_886dc957-8460-4c2b-99a3-92b9d869a6cd/extract-content/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.887045 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgz2x_886dc957-8460-4c2b-99a3-92b9d869a6cd/extract-utilities/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.888510 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jp2sv_09f66f54-3515-402f-b986-e278748ef6d4/extract-utilities/0.log" Feb 15 17:54:55 crc kubenswrapper[4585]: I0215 17:54:55.992474 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jgz2x_886dc957-8460-4c2b-99a3-92b9d869a6cd/registry-server/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.147190 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jp2sv_09f66f54-3515-402f-b986-e278748ef6d4/extract-utilities/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.189434 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jp2sv_09f66f54-3515-402f-b986-e278748ef6d4/extract-content/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.220700 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jp2sv_09f66f54-3515-402f-b986-e278748ef6d4/extract-content/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.350111 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jp2sv_09f66f54-3515-402f-b986-e278748ef6d4/extract-utilities/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.446676 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jp2sv_09f66f54-3515-402f-b986-e278748ef6d4/registry-server/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.460819 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jp2sv_09f66f54-3515-402f-b986-e278748ef6d4/extract-content/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.509899 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mcd69_669fa704-8e84-4c96-abf3-11f606645785/extract-utilities/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.645448 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mcd69_669fa704-8e84-4c96-abf3-11f606645785/extract-content/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.648270 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mcd69_669fa704-8e84-4c96-abf3-11f606645785/extract-content/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.669034 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mcd69_669fa704-8e84-4c96-abf3-11f606645785/extract-utilities/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.870328 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mcd69_669fa704-8e84-4c96-abf3-11f606645785/extract-content/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.884754 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mcd69_669fa704-8e84-4c96-abf3-11f606645785/extract-utilities/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.913722 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nxp97_f5cad97d-c68c-4859-bc2b-746461b8b361/extract-utilities/0.log" Feb 15 17:54:56 crc kubenswrapper[4585]: I0215 17:54:56.981609 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mcd69_669fa704-8e84-4c96-abf3-11f606645785/registry-server/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.160996 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nxp97_f5cad97d-c68c-4859-bc2b-746461b8b361/extract-utilities/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.170925 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nxp97_f5cad97d-c68c-4859-bc2b-746461b8b361/extract-content/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.172925 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-nxp97_f5cad97d-c68c-4859-bc2b-746461b8b361/extract-content/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.323902 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nxp97_f5cad97d-c68c-4859-bc2b-746461b8b361/extract-utilities/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.397747 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nxp97_f5cad97d-c68c-4859-bc2b-746461b8b361/registry-server/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.415627 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nxp97_f5cad97d-c68c-4859-bc2b-746461b8b361/extract-content/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.475026 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p2lqh_996d261a-6fd6-44a8-bb7b-78e9f2024a05/extract-utilities/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.626430 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p2lqh_996d261a-6fd6-44a8-bb7b-78e9f2024a05/extract-utilities/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.634168 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p2lqh_996d261a-6fd6-44a8-bb7b-78e9f2024a05/extract-content/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.675407 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p2lqh_996d261a-6fd6-44a8-bb7b-78e9f2024a05/extract-content/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.865245 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p2lqh_996d261a-6fd6-44a8-bb7b-78e9f2024a05/registry-server/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.880357 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p2lqh_996d261a-6fd6-44a8-bb7b-78e9f2024a05/extract-content/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.880476 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p2lqh_996d261a-6fd6-44a8-bb7b-78e9f2024a05/extract-utilities/0.log" Feb 15 17:54:57 crc kubenswrapper[4585]: I0215 17:54:57.945337 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q5q42_817eac9e-3482-4350-bc11-f58ec7bad74c/extract-utilities/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.109698 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q5q42_817eac9e-3482-4350-bc11-f58ec7bad74c/extract-utilities/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.136478 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q5q42_817eac9e-3482-4350-bc11-f58ec7bad74c/extract-content/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.139113 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q5q42_817eac9e-3482-4350-bc11-f58ec7bad74c/extract-content/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.293362 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q5q42_817eac9e-3482-4350-bc11-f58ec7bad74c/extract-utilities/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.348162 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-q5q42_817eac9e-3482-4350-bc11-f58ec7bad74c/registry-server/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.362754 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q5q42_817eac9e-3482-4350-bc11-f58ec7bad74c/extract-content/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.406922 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r4fvv_6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b/extract-utilities/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.651914 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r4fvv_6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b/extract-utilities/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.742778 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r4fvv_6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b/extract-content/0.log" Feb 15 17:54:58 crc kubenswrapper[4585]: I0215 17:54:58.762067 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r4fvv_6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b/extract-content/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.163115 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r4fvv_6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b/extract-utilities/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.163765 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r4fvv_6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b/extract-content/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.191663 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rkrdk_db06415c-64a8-4c80-8fab-d528d407d10a/extract-utilities/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.340456 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r4fvv_6abf8669-1bd8-4d9a-8926-a3b4c3dc7d0b/registry-server/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.486851 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rkrdk_db06415c-64a8-4c80-8fab-d528d407d10a/extract-content/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.516148 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rkrdk_db06415c-64a8-4c80-8fab-d528d407d10a/extract-utilities/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.539892 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rkrdk_db06415c-64a8-4c80-8fab-d528d407d10a/extract-content/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.658800 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rkrdk_db06415c-64a8-4c80-8fab-d528d407d10a/extract-content/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.679549 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rkrdk_db06415c-64a8-4c80-8fab-d528d407d10a/extract-utilities/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.766024 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rkrdk_db06415c-64a8-4c80-8fab-d528d407d10a/registry-server/0.log" Feb 15 17:54:59 crc kubenswrapper[4585]: I0215 17:54:59.855655 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-spm6s_ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe/extract-utilities/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.000148 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-spm6s_ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe/extract-utilities/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.004980 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-spm6s_ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe/extract-content/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.005289 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-spm6s_ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe/extract-content/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.213327 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-spm6s_ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe/extract-utilities/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.271902 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-spm6s_ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe/extract-content/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.323575 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-spm6s_ad8c0d7b-9e86-4762-a59b-6e6bc4186bfe/registry-server/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.355388 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szkmh_4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd/extract-utilities/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.490956 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-szkmh_4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd/extract-utilities/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.516977 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szkmh_4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd/extract-content/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.540128 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szkmh_4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd/extract-content/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.709996 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szkmh_4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd/extract-utilities/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.729214 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szkmh_4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd/extract-content/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.797444 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-szkmh_4aaf02c8-1b75-48a7-a9a0-891f45e7a9dd/registry-server/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.861571 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tcrgd_e06966bc-b5b4-4b56-b7e0-ecd065633b99/extract-utilities/0.log" Feb 15 17:55:00 crc kubenswrapper[4585]: I0215 17:55:00.987090 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tcrgd_e06966bc-b5b4-4b56-b7e0-ecd065633b99/extract-utilities/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.034164 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-tcrgd_e06966bc-b5b4-4b56-b7e0-ecd065633b99/extract-content/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.055341 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tcrgd_e06966bc-b5b4-4b56-b7e0-ecd065633b99/extract-content/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.222045 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tcrgd_e06966bc-b5b4-4b56-b7e0-ecd065633b99/extract-utilities/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.232347 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tcrgd_e06966bc-b5b4-4b56-b7e0-ecd065633b99/extract-content/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.289371 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tcrgd_e06966bc-b5b4-4b56-b7e0-ecd065633b99/registry-server/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.305948 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tvwl7_b92116ae-2b01-4062-ba27-ce0ad95d9b83/extract-utilities/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.528968 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tvwl7_b92116ae-2b01-4062-ba27-ce0ad95d9b83/extract-utilities/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.535623 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tvwl7_b92116ae-2b01-4062-ba27-ce0ad95d9b83/extract-content/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.559330 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-tvwl7_b92116ae-2b01-4062-ba27-ce0ad95d9b83/extract-content/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.759822 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tvwl7_b92116ae-2b01-4062-ba27-ce0ad95d9b83/extract-utilities/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.778199 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vcnr2_62d2ce7a-0eba-4d09-8c96-260ff647e3b2/extract-utilities/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.837693 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tvwl7_b92116ae-2b01-4062-ba27-ce0ad95d9b83/registry-server/0.log" Feb 15 17:55:01 crc kubenswrapper[4585]: I0215 17:55:01.845550 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-tvwl7_b92116ae-2b01-4062-ba27-ce0ad95d9b83/extract-content/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.028235 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vcnr2_62d2ce7a-0eba-4d09-8c96-260ff647e3b2/extract-utilities/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.031551 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vcnr2_62d2ce7a-0eba-4d09-8c96-260ff647e3b2/extract-content/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.068464 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vcnr2_62d2ce7a-0eba-4d09-8c96-260ff647e3b2/extract-content/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.225469 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-vcnr2_62d2ce7a-0eba-4d09-8c96-260ff647e3b2/extract-content/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.308823 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vcnr2_62d2ce7a-0eba-4d09-8c96-260ff647e3b2/extract-utilities/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.349453 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vwfmx_372e5a42-5ade-4cc6-a41f-72d5119f2e64/extract-utilities/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.371407 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vcnr2_62d2ce7a-0eba-4d09-8c96-260ff647e3b2/registry-server/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.676058 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vwfmx_372e5a42-5ade-4cc6-a41f-72d5119f2e64/extract-content/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.706170 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vwfmx_372e5a42-5ade-4cc6-a41f-72d5119f2e64/extract-content/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.767539 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vwfmx_372e5a42-5ade-4cc6-a41f-72d5119f2e64/extract-utilities/0.log" Feb 15 17:55:02 crc kubenswrapper[4585]: I0215 17:55:02.847284 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:55:02 crc kubenswrapper[4585]: E0215 17:55:02.847492 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.090694 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vwfmx_372e5a42-5ade-4cc6-a41f-72d5119f2e64/extract-content/0.log" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.104308 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vwfmx_372e5a42-5ade-4cc6-a41f-72d5119f2e64/extract-utilities/0.log" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.207481 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-whfqg_e8336aa7-5445-4d29-a676-e4a588683b09/extract-utilities/0.log" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.263338 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vwfmx_372e5a42-5ade-4cc6-a41f-72d5119f2e64/registry-server/0.log" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.391406 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-whfqg_e8336aa7-5445-4d29-a676-e4a588683b09/extract-content/0.log" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.408993 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-whfqg_e8336aa7-5445-4d29-a676-e4a588683b09/extract-content/0.log" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.462007 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-whfqg_e8336aa7-5445-4d29-a676-e4a588683b09/extract-utilities/0.log" Feb 15 17:55:03 crc kubenswrapper[4585]: I0215 17:55:03.951406 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-whfqg_e8336aa7-5445-4d29-a676-e4a588683b09/extract-utilities/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.034555 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-whfqg_e8336aa7-5445-4d29-a676-e4a588683b09/registry-server/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.062565 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x88c9_102ceb8e-9746-4512-b8c9-b00cec81b1a8/extract-utilities/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.062968 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-whfqg_e8336aa7-5445-4d29-a676-e4a588683b09/extract-content/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.261058 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x88c9_102ceb8e-9746-4512-b8c9-b00cec81b1a8/extract-content/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.262822 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x88c9_102ceb8e-9746-4512-b8c9-b00cec81b1a8/extract-content/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.284730 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x88c9_102ceb8e-9746-4512-b8c9-b00cec81b1a8/extract-utilities/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.495674 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x88c9_102ceb8e-9746-4512-b8c9-b00cec81b1a8/extract-utilities/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.508797 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-x88c9_102ceb8e-9746-4512-b8c9-b00cec81b1a8/extract-content/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.607182 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xblvr_26ccbe5c-984a-4e61-9897-6357cdd14cab/extract-utilities/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.617649 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x88c9_102ceb8e-9746-4512-b8c9-b00cec81b1a8/registry-server/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.762970 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xblvr_26ccbe5c-984a-4e61-9897-6357cdd14cab/extract-utilities/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.808187 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xblvr_26ccbe5c-984a-4e61-9897-6357cdd14cab/extract-content/0.log" Feb 15 17:55:04 crc kubenswrapper[4585]: I0215 17:55:04.831892 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xblvr_26ccbe5c-984a-4e61-9897-6357cdd14cab/extract-content/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.043327 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xblvr_26ccbe5c-984a-4e61-9897-6357cdd14cab/extract-utilities/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.062804 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xblvr_26ccbe5c-984a-4e61-9897-6357cdd14cab/extract-content/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.073373 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xntlx_54df0770-f532-412f-985c-fa6514121ecf/extract-utilities/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.172348 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xblvr_26ccbe5c-984a-4e61-9897-6357cdd14cab/registry-server/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.311562 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xntlx_54df0770-f532-412f-985c-fa6514121ecf/extract-utilities/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.325249 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xntlx_54df0770-f532-412f-985c-fa6514121ecf/extract-content/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.353219 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xntlx_54df0770-f532-412f-985c-fa6514121ecf/extract-content/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.555521 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xntlx_54df0770-f532-412f-985c-fa6514121ecf/extract-content/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.573343 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xntlx_54df0770-f532-412f-985c-fa6514121ecf/extract-utilities/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.620437 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xqp7w_ec1e8351-6185-4fdb-b97b-29efaea499cd/extract-utilities/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.622091 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xntlx_54df0770-f532-412f-985c-fa6514121ecf/registry-server/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.813919 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xqp7w_ec1e8351-6185-4fdb-b97b-29efaea499cd/extract-utilities/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.836040 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xqp7w_ec1e8351-6185-4fdb-b97b-29efaea499cd/extract-content/0.log" Feb 15 17:55:05 crc kubenswrapper[4585]: I0215 17:55:05.865218 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xqp7w_ec1e8351-6185-4fdb-b97b-29efaea499cd/extract-content/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.021312 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xqp7w_ec1e8351-6185-4fdb-b97b-29efaea499cd/extract-utilities/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.057372 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xqp7w_ec1e8351-6185-4fdb-b97b-29efaea499cd/extract-content/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.146102 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xs7w7_90aac239-5f41-437c-8bc1-a9164637e556/extract-utilities/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.176756 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xqp7w_ec1e8351-6185-4fdb-b97b-29efaea499cd/registry-server/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.388872 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xs7w7_90aac239-5f41-437c-8bc1-a9164637e556/extract-content/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.392113 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xs7w7_90aac239-5f41-437c-8bc1-a9164637e556/extract-content/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.441291 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xs7w7_90aac239-5f41-437c-8bc1-a9164637e556/extract-utilities/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.600554 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xs7w7_90aac239-5f41-437c-8bc1-a9164637e556/extract-utilities/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.639015 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xs7w7_90aac239-5f41-437c-8bc1-a9164637e556/extract-content/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.681057 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xs7w7_90aac239-5f41-437c-8bc1-a9164637e556/registry-server/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.700204 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk_ecd88680-0e48-4c4f-9568-9f63dcd3e127/util/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.835347 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk_ecd88680-0e48-4c4f-9568-9f63dcd3e127/pull/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.843563 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk_ecd88680-0e48-4c4f-9568-9f63dcd3e127/util/0.log" Feb 15 17:55:06 crc kubenswrapper[4585]: I0215 17:55:06.883981 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk_ecd88680-0e48-4c4f-9568-9f63dcd3e127/pull/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.113864 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk_ecd88680-0e48-4c4f-9568-9f63dcd3e127/util/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.114726 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk_ecd88680-0e48-4c4f-9568-9f63dcd3e127/pull/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.235170 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecakfdbk_ecd88680-0e48-4c4f-9568-9f63dcd3e127/extract/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.430249 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-h2jdb_78351f70-0518-4b26-b551-48b047371fa7/marketplace-operator/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.561084 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-48nwt_7174606f-3071-43e5-88ce-c9e22d408b2e/extract-utilities/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.792957 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-48nwt_7174606f-3071-43e5-88ce-c9e22d408b2e/extract-content/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.793102 4585 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-48nwt_7174606f-3071-43e5-88ce-c9e22d408b2e/extract-content/0.log" Feb 15 17:55:07 crc kubenswrapper[4585]: I0215 17:55:07.809902 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-48nwt_7174606f-3071-43e5-88ce-c9e22d408b2e/extract-utilities/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.096429 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-48nwt_7174606f-3071-43e5-88ce-c9e22d408b2e/extract-utilities/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.113148 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-48nwt_7174606f-3071-43e5-88ce-c9e22d408b2e/registry-server/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.117579 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6sll_f615e254-695a-452c-86c3-312b4dbabdb1/extract-utilities/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.179663 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-48nwt_7174606f-3071-43e5-88ce-c9e22d408b2e/extract-content/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.345713 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6sll_f615e254-695a-452c-86c3-312b4dbabdb1/extract-content/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.356698 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6sll_f615e254-695a-452c-86c3-312b4dbabdb1/extract-utilities/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.383080 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-m6sll_f615e254-695a-452c-86c3-312b4dbabdb1/extract-content/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.614799 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6sll_f615e254-695a-452c-86c3-312b4dbabdb1/extract-utilities/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.709524 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6sll_f615e254-695a-452c-86c3-312b4dbabdb1/extract-content/0.log" Feb 15 17:55:08 crc kubenswrapper[4585]: I0215 17:55:08.980439 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6sll_f615e254-695a-452c-86c3-312b4dbabdb1/registry-server/0.log" Feb 15 17:55:13 crc kubenswrapper[4585]: I0215 17:55:13.842556 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:55:13 crc kubenswrapper[4585]: E0215 17:55:13.843181 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:55:24 crc kubenswrapper[4585]: I0215 17:55:24.853575 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:55:24 crc kubenswrapper[4585]: E0215 17:55:24.854356 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:55:35 crc kubenswrapper[4585]: I0215 17:55:35.842454 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:55:35 crc kubenswrapper[4585]: E0215 17:55:35.844070 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:55:47 crc kubenswrapper[4585]: I0215 17:55:47.842006 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:55:47 crc kubenswrapper[4585]: E0215 17:55:47.842888 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:56:01 crc kubenswrapper[4585]: I0215 17:56:01.841940 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:56:01 crc kubenswrapper[4585]: E0215 17:56:01.842774 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:56:16 crc kubenswrapper[4585]: I0215 17:56:16.842227 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:56:16 crc kubenswrapper[4585]: E0215 17:56:16.843001 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:56:28 crc kubenswrapper[4585]: I0215 17:56:28.854042 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:56:28 crc kubenswrapper[4585]: E0215 17:56:28.855361 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:56:43 crc kubenswrapper[4585]: I0215 17:56:43.843269 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:56:43 crc kubenswrapper[4585]: E0215 17:56:43.844005 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:56:46 crc kubenswrapper[4585]: I0215 17:56:46.665037 4585 generic.go:334] "Generic (PLEG): container finished" podID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerID="f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284" exitCode=0 Feb 15 17:56:46 crc kubenswrapper[4585]: I0215 17:56:46.665151 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8bz9w/must-gather-47drt" event={"ID":"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6","Type":"ContainerDied","Data":"f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284"} Feb 15 17:56:46 crc kubenswrapper[4585]: I0215 17:56:46.666768 4585 scope.go:117] "RemoveContainer" containerID="f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284" Feb 15 17:56:47 crc kubenswrapper[4585]: I0215 17:56:47.677642 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8bz9w_must-gather-47drt_ba76fcf9-3d5b-40db-94ef-09a5c53e75a6/gather/0.log" Feb 15 17:56:55 crc kubenswrapper[4585]: I0215 17:56:55.910967 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8bz9w/must-gather-47drt"] Feb 15 17:56:55 crc kubenswrapper[4585]: I0215 17:56:55.911655 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8bz9w/must-gather-47drt" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerName="copy" containerID="cri-o://390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665" gracePeriod=2 Feb 15 17:56:55 crc kubenswrapper[4585]: I0215 17:56:55.922153 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8bz9w/must-gather-47drt"] Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 
17:56:56.382115 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8bz9w_must-gather-47drt_ba76fcf9-3d5b-40db-94ef-09a5c53e75a6/copy/0.log" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.382936 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.442707 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-must-gather-output\") pod \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.443185 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29zqr\" (UniqueName: \"kubernetes.io/projected/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-kube-api-access-29zqr\") pod \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\" (UID: \"ba76fcf9-3d5b-40db-94ef-09a5c53e75a6\") " Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.464032 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-kube-api-access-29zqr" (OuterVolumeSpecName: "kube-api-access-29zqr") pod "ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" (UID: "ba76fcf9-3d5b-40db-94ef-09a5c53e75a6"). InnerVolumeSpecName "kube-api-access-29zqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.545172 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29zqr\" (UniqueName: \"kubernetes.io/projected/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-kube-api-access-29zqr\") on node \"crc\" DevicePath \"\"" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.600803 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" (UID: "ba76fcf9-3d5b-40db-94ef-09a5c53e75a6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.647723 4585 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.784868 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8bz9w_must-gather-47drt_ba76fcf9-3d5b-40db-94ef-09a5c53e75a6/copy/0.log" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.785173 4585 generic.go:334] "Generic (PLEG): container finished" podID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerID="390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665" exitCode=143 Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.785223 4585 scope.go:117] "RemoveContainer" containerID="390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.785269 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8bz9w/must-gather-47drt" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.808495 4585 scope.go:117] "RemoveContainer" containerID="f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.841975 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:56:56 crc kubenswrapper[4585]: E0215 17:56:56.842403 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.862157 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" path="/var/lib/kubelet/pods/ba76fcf9-3d5b-40db-94ef-09a5c53e75a6/volumes" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.903747 4585 scope.go:117] "RemoveContainer" containerID="390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665" Feb 15 17:56:56 crc kubenswrapper[4585]: E0215 17:56:56.905831 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665\": container with ID starting with 390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665 not found: ID does not exist" containerID="390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.905861 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665"} err="failed to get container status \"390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665\": rpc error: code = NotFound desc = could not find container \"390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665\": container with ID starting with 390389fc1c6c4b4ca139baba7117e221011362e69e3a0fcb6730ec22f83ec665 not found: ID does not exist" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.905882 4585 scope.go:117] "RemoveContainer" containerID="f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284" Feb 15 17:56:56 crc kubenswrapper[4585]: E0215 17:56:56.906174 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284\": container with ID starting with f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284 not found: ID does not exist" containerID="f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284" Feb 15 17:56:56 crc kubenswrapper[4585]: I0215 17:56:56.906197 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284"} err="failed to get container status \"f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284\": rpc error: code = NotFound desc = could not find container \"f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284\": container with ID starting with f11c7f3fa4f68c524fb17de1839c2ec85adb3296db2d2d69b75e223c98e48284 not found: ID does not exist" Feb 15 17:57:11 crc kubenswrapper[4585]: I0215 17:57:11.841628 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:57:11 crc kubenswrapper[4585]: E0215 17:57:11.842231 4585 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:57:26 crc kubenswrapper[4585]: I0215 17:57:26.842375 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:57:26 crc kubenswrapper[4585]: E0215 17:57:26.843475 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:57:41 crc kubenswrapper[4585]: I0215 17:57:41.842636 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:57:41 crc kubenswrapper[4585]: E0215 17:57:41.843678 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:57:55 crc kubenswrapper[4585]: I0215 17:57:55.842855 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:57:55 crc kubenswrapper[4585]: E0215 17:57:55.843775 4585 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:58:06 crc kubenswrapper[4585]: I0215 17:58:06.841804 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:58:06 crc kubenswrapper[4585]: E0215 17:58:06.843019 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:58:19 crc kubenswrapper[4585]: I0215 17:58:19.844673 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:58:19 crc kubenswrapper[4585]: E0215 17:58:19.845377 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:58:32 crc kubenswrapper[4585]: I0215 17:58:32.842655 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:58:32 crc kubenswrapper[4585]: E0215 17:58:32.843376 4585 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:58:45 crc kubenswrapper[4585]: I0215 17:58:45.842023 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:58:45 crc kubenswrapper[4585]: E0215 17:58:45.842671 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hptv_openshift-machine-config-operator(0c41aeb2-e722-4379-b7d6-fe499719f9d2)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" Feb 15 17:58:57 crc kubenswrapper[4585]: I0215 17:58:57.843153 4585 scope.go:117] "RemoveContainer" containerID="667f9b3ab65136d55bc6a3516b4c60242b77e8ce534e905cdf56b1fa36972e44" Feb 15 17:58:58 crc kubenswrapper[4585]: I0215 17:58:58.304353 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" event={"ID":"0c41aeb2-e722-4379-b7d6-fe499719f9d2","Type":"ContainerStarted","Data":"aa70d44b0f0b7781602a7c16fe5654515d881e058dd1182760a8806deb26dd20"} Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.559989 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ghnhq"] Feb 15 17:59:08 crc kubenswrapper[4585]: E0215 17:59:08.560861 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="registry-server" Feb 15 
17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.560875 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="registry-server" Feb 15 17:59:08 crc kubenswrapper[4585]: E0215 17:59:08.560883 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerName="copy" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.560890 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerName="copy" Feb 15 17:59:08 crc kubenswrapper[4585]: E0215 17:59:08.560908 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="extract-utilities" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.560914 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="extract-utilities" Feb 15 17:59:08 crc kubenswrapper[4585]: E0215 17:59:08.560930 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerName="gather" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.560936 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerName="gather" Feb 15 17:59:08 crc kubenswrapper[4585]: E0215 17:59:08.560956 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="extract-content" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.560961 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="extract-content" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.561162 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="124fe4ca-8f5e-4405-97c5-edf3963a1534" containerName="registry-server" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 
17:59:08.561176 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerName="copy" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.561186 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba76fcf9-3d5b-40db-94ef-09a5c53e75a6" containerName="gather" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.563703 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.582787 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ghnhq"] Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.695412 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-catalog-content\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.695506 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-utilities\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.696281 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9tgz\" (UniqueName: \"kubernetes.io/projected/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-kube-api-access-p9tgz\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 
17:59:08.797891 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-catalog-content\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.797994 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-utilities\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.798107 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9tgz\" (UniqueName: \"kubernetes.io/projected/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-kube-api-access-p9tgz\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.798426 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-catalog-content\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.798510 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-utilities\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.820367 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9tgz\" (UniqueName: \"kubernetes.io/projected/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-kube-api-access-p9tgz\") pod \"certified-operators-ghnhq\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:08 crc kubenswrapper[4585]: I0215 17:59:08.896712 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:09 crc kubenswrapper[4585]: I0215 17:59:09.487515 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ghnhq"] Feb 15 17:59:10 crc kubenswrapper[4585]: I0215 17:59:10.455201 4585 generic.go:334] "Generic (PLEG): container finished" podID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerID="805fb3e6e01caf6d222e19f3869f4c8989c6ea1ecd3695f502ec037bce6163e6" exitCode=0 Feb 15 17:59:10 crc kubenswrapper[4585]: I0215 17:59:10.455419 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghnhq" event={"ID":"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99","Type":"ContainerDied","Data":"805fb3e6e01caf6d222e19f3869f4c8989c6ea1ecd3695f502ec037bce6163e6"} Feb 15 17:59:10 crc kubenswrapper[4585]: I0215 17:59:10.455639 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghnhq" event={"ID":"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99","Type":"ContainerStarted","Data":"91bf8ff69aa2e6df45da9144c123e86447bdbc97c68739bb7cf32c7dfcb0fd2f"} Feb 15 17:59:10 crc kubenswrapper[4585]: I0215 17:59:10.459196 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 15 17:59:11 crc kubenswrapper[4585]: I0215 17:59:11.472556 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghnhq" 
event={"ID":"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99","Type":"ContainerStarted","Data":"2351994d219aaa65e89ba07a4ebc8bb9df669538fd799120f106963b6220a3df"} Feb 15 17:59:13 crc kubenswrapper[4585]: I0215 17:59:13.492456 4585 generic.go:334] "Generic (PLEG): container finished" podID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerID="2351994d219aaa65e89ba07a4ebc8bb9df669538fd799120f106963b6220a3df" exitCode=0 Feb 15 17:59:13 crc kubenswrapper[4585]: I0215 17:59:13.492924 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghnhq" event={"ID":"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99","Type":"ContainerDied","Data":"2351994d219aaa65e89ba07a4ebc8bb9df669538fd799120f106963b6220a3df"} Feb 15 17:59:14 crc kubenswrapper[4585]: I0215 17:59:14.503660 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghnhq" event={"ID":"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99","Type":"ContainerStarted","Data":"4f8ba974b71a88319018373be6662541cfd8c23d17839875ad3ecd9c0574ece8"} Feb 15 17:59:14 crc kubenswrapper[4585]: I0215 17:59:14.529746 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ghnhq" podStartSLOduration=2.809142827 podStartE2EDuration="6.529724423s" podCreationTimestamp="2026-02-15 17:59:08 +0000 UTC" firstStartedPulling="2026-02-15 17:59:10.458965868 +0000 UTC m=+3206.402374000" lastFinishedPulling="2026-02-15 17:59:14.179547464 +0000 UTC m=+3210.122955596" observedRunningTime="2026-02-15 17:59:14.519801266 +0000 UTC m=+3210.463209408" watchObservedRunningTime="2026-02-15 17:59:14.529724423 +0000 UTC m=+3210.473132565" Feb 15 17:59:18 crc kubenswrapper[4585]: I0215 17:59:18.897663 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:18 crc kubenswrapper[4585]: I0215 17:59:18.898104 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:18 crc kubenswrapper[4585]: I0215 17:59:18.960403 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:19 crc kubenswrapper[4585]: I0215 17:59:19.679227 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:19 crc kubenswrapper[4585]: I0215 17:59:19.761787 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ghnhq"] Feb 15 17:59:21 crc kubenswrapper[4585]: I0215 17:59:21.607195 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ghnhq" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="registry-server" containerID="cri-o://4f8ba974b71a88319018373be6662541cfd8c23d17839875ad3ecd9c0574ece8" gracePeriod=2 Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.643590 4585 generic.go:334] "Generic (PLEG): container finished" podID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerID="4f8ba974b71a88319018373be6662541cfd8c23d17839875ad3ecd9c0574ece8" exitCode=0 Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.644136 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghnhq" event={"ID":"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99","Type":"ContainerDied","Data":"4f8ba974b71a88319018373be6662541cfd8c23d17839875ad3ecd9c0574ece8"} Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.644166 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghnhq" event={"ID":"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99","Type":"ContainerDied","Data":"91bf8ff69aa2e6df45da9144c123e86447bdbc97c68739bb7cf32c7dfcb0fd2f"} Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.644178 4585 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="91bf8ff69aa2e6df45da9144c123e86447bdbc97c68739bb7cf32c7dfcb0fd2f" Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.653518 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.747130 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9tgz\" (UniqueName: \"kubernetes.io/projected/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-kube-api-access-p9tgz\") pod \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.747238 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-catalog-content\") pod \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.747321 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-utilities\") pod \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\" (UID: \"0bdc90e8-e18a-41bb-9ef4-bd12f6402b99\") " Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.748315 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-utilities" (OuterVolumeSpecName: "utilities") pod "0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" (UID: "0bdc90e8-e18a-41bb-9ef4-bd12f6402b99"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.752738 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-kube-api-access-p9tgz" (OuterVolumeSpecName: "kube-api-access-p9tgz") pod "0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" (UID: "0bdc90e8-e18a-41bb-9ef4-bd12f6402b99"). InnerVolumeSpecName "kube-api-access-p9tgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.797338 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" (UID: "0bdc90e8-e18a-41bb-9ef4-bd12f6402b99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.849368 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9tgz\" (UniqueName: \"kubernetes.io/projected/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-kube-api-access-p9tgz\") on node \"crc\" DevicePath \"\"" Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.849403 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 15 17:59:22 crc kubenswrapper[4585]: I0215 17:59:22.849415 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99-utilities\") on node \"crc\" DevicePath \"\"" Feb 15 17:59:23 crc kubenswrapper[4585]: I0215 17:59:23.652249 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ghnhq" Feb 15 17:59:23 crc kubenswrapper[4585]: I0215 17:59:23.675416 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ghnhq"] Feb 15 17:59:23 crc kubenswrapper[4585]: I0215 17:59:23.688098 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ghnhq"] Feb 15 17:59:24 crc kubenswrapper[4585]: I0215 17:59:24.903534 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" path="/var/lib/kubelet/pods/0bdc90e8-e18a-41bb-9ef4-bd12f6402b99/volumes" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.163582 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz"] Feb 15 18:00:00 crc kubenswrapper[4585]: E0215 18:00:00.164571 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="extract-utilities" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.164584 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="extract-utilities" Feb 15 18:00:00 crc kubenswrapper[4585]: E0215 18:00:00.164633 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="extract-content" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.164639 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="extract-content" Feb 15 18:00:00 crc kubenswrapper[4585]: E0215 18:00:00.164651 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="registry-server" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.164658 4585 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="registry-server" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.164883 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bdc90e8-e18a-41bb-9ef4-bd12f6402b99" containerName="registry-server" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.165514 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.169806 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.170403 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.185687 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz"] Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.276899 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-config-volume\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.276972 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548l4\" (UniqueName: \"kubernetes.io/projected/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-kube-api-access-548l4\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 
18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.277217 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-secret-volume\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.382768 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-config-volume\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.381169 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-config-volume\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.383325 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548l4\" (UniqueName: \"kubernetes.io/projected/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-kube-api-access-548l4\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.384348 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-secret-volume\") pod \"collect-profiles-29519640-nhtfz\" (UID: 
\"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.396540 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-secret-volume\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.404249 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548l4\" (UniqueName: \"kubernetes.io/projected/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-kube-api-access-548l4\") pod \"collect-profiles-29519640-nhtfz\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.485317 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:00 crc kubenswrapper[4585]: I0215 18:00:00.990838 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz"] Feb 15 18:00:01 crc kubenswrapper[4585]: I0215 18:00:01.091876 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" event={"ID":"f18861c9-a905-41c5-9dcd-8c0fe53ee64c","Type":"ContainerStarted","Data":"2cd0b58554d90b6cda916cfe431a955d066c2e6f6d22925d0ebbc324b3cec7fb"} Feb 15 18:00:02 crc kubenswrapper[4585]: I0215 18:00:02.106141 4585 generic.go:334] "Generic (PLEG): container finished" podID="f18861c9-a905-41c5-9dcd-8c0fe53ee64c" containerID="46547823591c547a814b411095f48b2af9fd7c180f4d1cc4b71833b4d1176f68" exitCode=0 Feb 15 18:00:02 crc kubenswrapper[4585]: I0215 18:00:02.106209 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" event={"ID":"f18861c9-a905-41c5-9dcd-8c0fe53ee64c","Type":"ContainerDied","Data":"46547823591c547a814b411095f48b2af9fd7c180f4d1cc4b71833b4d1176f68"} Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.762709 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.868994 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-548l4\" (UniqueName: \"kubernetes.io/projected/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-kube-api-access-548l4\") pod \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.869068 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-config-volume\") pod \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.869242 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-secret-volume\") pod \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\" (UID: \"f18861c9-a905-41c5-9dcd-8c0fe53ee64c\") " Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.869811 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-config-volume" (OuterVolumeSpecName: "config-volume") pod "f18861c9-a905-41c5-9dcd-8c0fe53ee64c" (UID: "f18861c9-a905-41c5-9dcd-8c0fe53ee64c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.869960 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.874269 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-kube-api-access-548l4" (OuterVolumeSpecName: "kube-api-access-548l4") pod "f18861c9-a905-41c5-9dcd-8c0fe53ee64c" (UID: "f18861c9-a905-41c5-9dcd-8c0fe53ee64c"). InnerVolumeSpecName "kube-api-access-548l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.875371 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f18861c9-a905-41c5-9dcd-8c0fe53ee64c" (UID: "f18861c9-a905-41c5-9dcd-8c0fe53ee64c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.972263 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-548l4\" (UniqueName: \"kubernetes.io/projected/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-kube-api-access-548l4\") on node \"crc\" DevicePath \"\"" Feb 15 18:00:03 crc kubenswrapper[4585]: I0215 18:00:03.972297 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f18861c9-a905-41c5-9dcd-8c0fe53ee64c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 15 18:00:04 crc kubenswrapper[4585]: I0215 18:00:04.125811 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" event={"ID":"f18861c9-a905-41c5-9dcd-8c0fe53ee64c","Type":"ContainerDied","Data":"2cd0b58554d90b6cda916cfe431a955d066c2e6f6d22925d0ebbc324b3cec7fb"} Feb 15 18:00:04 crc kubenswrapper[4585]: I0215 18:00:04.125855 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd0b58554d90b6cda916cfe431a955d066c2e6f6d22925d0ebbc324b3cec7fb" Feb 15 18:00:04 crc kubenswrapper[4585]: I0215 18:00:04.125916 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29519640-nhtfz" Feb 15 18:00:04 crc kubenswrapper[4585]: I0215 18:00:04.864183 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"] Feb 15 18:00:04 crc kubenswrapper[4585]: I0215 18:00:04.864219 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29519595-b2ljf"] Feb 15 18:00:06 crc kubenswrapper[4585]: I0215 18:00:06.913889 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11020ade-99c4-480e-bb24-5e0051a2b870" path="/var/lib/kubelet/pods/11020ade-99c4-480e-bb24-5e0051a2b870/volumes" Feb 15 18:00:51 crc kubenswrapper[4585]: I0215 18:00:51.261002 4585 scope.go:117] "RemoveContainer" containerID="37de8d54a840b44884ea91565c0ba9da0456ddade5f48def5a0215d0fa231949" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.170012 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29519641-c8zj7"] Feb 15 18:01:00 crc kubenswrapper[4585]: E0215 18:01:00.171424 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18861c9-a905-41c5-9dcd-8c0fe53ee64c" containerName="collect-profiles" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.171448 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18861c9-a905-41c5-9dcd-8c0fe53ee64c" containerName="collect-profiles" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.171921 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18861c9-a905-41c5-9dcd-8c0fe53ee64c" containerName="collect-profiles" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.172801 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.189908 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29519641-c8zj7"] Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.365791 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-fernet-keys\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.366292 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7d72\" (UniqueName: \"kubernetes.io/projected/32e6efaa-9b53-4312-bd38-92497fe38b6b-kube-api-access-x7d72\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.366436 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-combined-ca-bundle\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.366555 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-config-data\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.468430 4585 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-x7d72\" (UniqueName: \"kubernetes.io/projected/32e6efaa-9b53-4312-bd38-92497fe38b6b-kube-api-access-x7d72\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.468728 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-combined-ca-bundle\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.468873 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-config-data\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.468988 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-fernet-keys\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.476360 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-config-data\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.478920 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-fernet-keys\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.484935 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-combined-ca-bundle\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.501039 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7d72\" (UniqueName: \"kubernetes.io/projected/32e6efaa-9b53-4312-bd38-92497fe38b6b-kube-api-access-x7d72\") pod \"keystone-cron-29519641-c8zj7\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:00 crc kubenswrapper[4585]: I0215 18:01:00.797729 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:01 crc kubenswrapper[4585]: I0215 18:01:01.330842 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29519641-c8zj7"] Feb 15 18:01:01 crc kubenswrapper[4585]: I0215 18:01:01.898611 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29519641-c8zj7" event={"ID":"32e6efaa-9b53-4312-bd38-92497fe38b6b","Type":"ContainerStarted","Data":"fd7155798442819403fe270763c4b763d6bdb40d7c86f9450332461727ccbae6"} Feb 15 18:01:01 crc kubenswrapper[4585]: I0215 18:01:01.898866 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29519641-c8zj7" event={"ID":"32e6efaa-9b53-4312-bd38-92497fe38b6b","Type":"ContainerStarted","Data":"c1382449d1d65eef95c6d6b68b121cad40f51bdd73ff26c4528e472f6f7ca0b7"} Feb 15 18:01:01 crc kubenswrapper[4585]: I0215 18:01:01.933359 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29519641-c8zj7" podStartSLOduration=1.933343383 podStartE2EDuration="1.933343383s" podCreationTimestamp="2026-02-15 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-15 18:01:01.929742846 +0000 UTC m=+3317.873150978" watchObservedRunningTime="2026-02-15 18:01:01.933343383 +0000 UTC m=+3317.876751515" Feb 15 18:01:04 crc kubenswrapper[4585]: I0215 18:01:04.936871 4585 generic.go:334] "Generic (PLEG): container finished" podID="32e6efaa-9b53-4312-bd38-92497fe38b6b" containerID="fd7155798442819403fe270763c4b763d6bdb40d7c86f9450332461727ccbae6" exitCode=0 Feb 15 18:01:04 crc kubenswrapper[4585]: I0215 18:01:04.936979 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29519641-c8zj7" 
event={"ID":"32e6efaa-9b53-4312-bd38-92497fe38b6b","Type":"ContainerDied","Data":"fd7155798442819403fe270763c4b763d6bdb40d7c86f9450332461727ccbae6"} Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.307917 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.436091 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-combined-ca-bundle\") pod \"32e6efaa-9b53-4312-bd38-92497fe38b6b\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.436265 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-config-data\") pod \"32e6efaa-9b53-4312-bd38-92497fe38b6b\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.436358 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-fernet-keys\") pod \"32e6efaa-9b53-4312-bd38-92497fe38b6b\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.436383 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7d72\" (UniqueName: \"kubernetes.io/projected/32e6efaa-9b53-4312-bd38-92497fe38b6b-kube-api-access-x7d72\") pod \"32e6efaa-9b53-4312-bd38-92497fe38b6b\" (UID: \"32e6efaa-9b53-4312-bd38-92497fe38b6b\") " Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.452662 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e6efaa-9b53-4312-bd38-92497fe38b6b-kube-api-access-x7d72" 
(OuterVolumeSpecName: "kube-api-access-x7d72") pod "32e6efaa-9b53-4312-bd38-92497fe38b6b" (UID: "32e6efaa-9b53-4312-bd38-92497fe38b6b"). InnerVolumeSpecName "kube-api-access-x7d72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.452702 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "32e6efaa-9b53-4312-bd38-92497fe38b6b" (UID: "32e6efaa-9b53-4312-bd38-92497fe38b6b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.473523 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32e6efaa-9b53-4312-bd38-92497fe38b6b" (UID: "32e6efaa-9b53-4312-bd38-92497fe38b6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.505799 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-config-data" (OuterVolumeSpecName: "config-data") pod "32e6efaa-9b53-4312-bd38-92497fe38b6b" (UID: "32e6efaa-9b53-4312-bd38-92497fe38b6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.539454 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.539491 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-config-data\") on node \"crc\" DevicePath \"\"" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.539504 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7d72\" (UniqueName: \"kubernetes.io/projected/32e6efaa-9b53-4312-bd38-92497fe38b6b-kube-api-access-x7d72\") on node \"crc\" DevicePath \"\"" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.539518 4585 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32e6efaa-9b53-4312-bd38-92497fe38b6b-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.968031 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29519641-c8zj7" event={"ID":"32e6efaa-9b53-4312-bd38-92497fe38b6b","Type":"ContainerDied","Data":"c1382449d1d65eef95c6d6b68b121cad40f51bdd73ff26c4528e472f6f7ca0b7"} Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.968342 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1382449d1d65eef95c6d6b68b121cad40f51bdd73ff26c4528e472f6f7ca0b7" Feb 15 18:01:06 crc kubenswrapper[4585]: I0215 18:01:06.968407 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29519641-c8zj7" Feb 15 18:01:17 crc kubenswrapper[4585]: I0215 18:01:17.013893 4585 patch_prober.go:28] interesting pod/machine-config-daemon-4hptv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 15 18:01:17 crc kubenswrapper[4585]: I0215 18:01:17.014301 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hptv" podUID="0c41aeb2-e722-4379-b7d6-fe499719f9d2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"